Graph ensemble neural network

Cited by: 4
Authors
Duan, Rui [1 ]
Yan, Chungang [2 ,3 ]
Wang, Junli [2 ,3 ]
Jiang, Changjun [2 ,3 ]
Affiliations
[1] Guangzhou Univ, Sch Comp Sci & Cyber Engn, Guangzhou 510000, Peoples R China
[2] Minist Educ, Key Lab Embedded Syst & Serv Comp, Shanghai 201804, Peoples R China
[3] Tongji Univ, Natl Prov Minist Joint Collaborat Innovat Ctr Fina, Shanghai 201804, Peoples R China
Keywords
Graph neural network; Data augmentation; Ensemble learning; Heterophily graphs
DOI
10.1016/j.inffus.2024.102461
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Ensemble methods have been shown to improve graph neural networks (GNNs). Existing ensemble methods on graphs build a strong classifier by combining a set of trained base classifiers, i.e., by combining the base classifiers' final outputs for prediction. However, these methods fail to help many popular GNNs perform well under heterophily (in graphs where many connected nodes have different class labels), which limits their applicability. Furthermore, they ignore the hierarchical nature of GNNs, so the base classifiers do not interact while neighbors are aggregated during training. Two issues arise from this: low applicability and shallow ensembling. We propose the Graph Ensemble Neural Network (GEN) to address these issues; it is not a simple ensemble of GNNs but instead integrates ensembling into GNNs to fuse a set of graphs. GEN deepens a single ensemble into multiple ensembles during training and applies to both homophily and heterophily graphs. In GEN, we design structure augmentation to generate graphs for training and feature augmentation to attenuate errors introduced by the initial features. Unlike existing graph ensemble methods, which execute only one ensemble, GEN executes multiple deep ensembles throughout neighbor aggregation to fuse the graphs generated by structure augmentation. Extensive experiments show that GEN achieves new state-of-the-art performance on homophily and heterophily graphs for semi- and full-supervised node classification. The source code of GEN is publicly available at https://github.com/graphNN/GEN1.
Pages: 11
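
The abstract describes two technique-level ideas: generating several training graphs via structure/feature augmentation, and fusing the per-graph neighbor aggregations at every layer rather than only combining final outputs. Below is a minimal, hypothetical PyTorch sketch of that layer-wise fusion idea only; the class names (MultiGraphLayer, GENSketch), the softmax fusion weights, and the random edge-dropping used as a stand-in for structure augmentation are illustrative assumptions, not the authors' implementation (see the repository linked in the abstract for that).

# Hypothetical sketch: layer-wise fusion over several augmented graphs.
# Not the authors' code; all names and the augmentation are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def normalize_adj(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalize a dense adjacency matrix with self-loops."""
    adj = adj + torch.eye(adj.size(0))
    d_inv_sqrt = adj.sum(dim=1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * adj * d_inv_sqrt.unsqueeze(0)


class MultiGraphLayer(nn.Module):
    """One propagation step that fuses aggregations from several graphs."""

    def __init__(self, in_dim: int, out_dim: int, num_graphs: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        # Learnable fusion weights over the augmented graphs (softmax-normalized).
        self.fusion = nn.Parameter(torch.zeros(num_graphs))

    def forward(self, x, adjs):
        weights = torch.softmax(self.fusion, dim=0)
        # Aggregate over each graph, then fuse the per-graph results.
        agg = sum(w * (a @ x) for w, a in zip(weights, adjs))
        return F.relu(self.linear(agg))


class GENSketch(nn.Module):
    """Two fused-propagation layers followed by a node classifier."""

    def __init__(self, in_dim: int, hid_dim: int, num_classes: int, num_graphs: int):
        super().__init__()
        self.layer1 = MultiGraphLayer(in_dim, hid_dim, num_graphs)
        self.layer2 = MultiGraphLayer(hid_dim, num_classes, num_graphs)

    def forward(self, x, adjs):
        return self.layer2(self.layer1(x, adjs), adjs)


if __name__ == "__main__":
    n, f = 6, 8
    x = torch.randn(n, f)
    base = (torch.rand(n, n) > 0.5).float()
    base = ((base + base.t()) > 0).float()               # original graph
    dropped = base * (torch.rand(n, n) > 0.3).float()    # crude edge-dropped view
    adjs = [normalize_adj(base), normalize_adj(dropped)]
    model = GENSketch(f, 16, 3, num_graphs=len(adjs))
    print(model(x, adjs).shape)  # torch.Size([6, 3])

The key design point illustrated here is that the fusion happens inside every propagation layer, so the augmented graphs interact during neighbor aggregation instead of being combined only at the final prediction step.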