EGNN: Graph structure learning based on evolutionary computation helps more in graph neural networks

Cited: 77
Authors
Liu, Zhaowei [1 ]
Yang, Dong [1 ]
Wang, Yingjie [1 ]
Lu, Mingjie [1 ]
Li, Ranran [1 ]
Affiliations
[1] Yantai Univ, Yantai 264005, Shandong, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Graph neural networks; Evolutionary computation; Graph representation learning; Graph structure learning
DOI
10.1016/j.asoc.2023.110040
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
In recent years, graph neural networks (GNNs) have been successfully applied in many fields thanks to their neighborhood-aggregation mechanism and have achieved state-of-the-art performance. However, the raw graph data that most GNNs operate on is frequently noisy or incomplete, which leads to suboptimal GNN performance. To address this problem, Graph Structure Learning (GSL) methods have recently emerged; they improve GNN performance by learning a graph structure that better conforms to the ground truth. Yet the prevailing GSL strategy alternately optimizes the graph structure and a single GNN, which runs into several training problems, namely vulnerability to adversarial perturbations and overfitting. This work introduces a novel GSL approach, the evolutionary graph neural network (EGNN), to improve defense against adversarial attacks and enhance GNN performance. Unlike existing GSL methods, which optimize the graph structure and the parameters of a single GNN model through alternating training, this work applies evolutionary theory to graph structure learning for the first time. Specifically, different graph structures generated by mutation operations are used to evolve a set of model parameters so that they adapt to the environment, i.e., improve the classification performance on unlabeled nodes. An evaluation mechanism then measures the quality of the generated candidates so that only the model parameters (progeny) with good performance are retained. Finally, the progeny that adapt to the environment are kept and used for further optimization. Through this process, EGNN overcomes the instability of graph structure learning and always evolves the best progeny, providing a new direction for the advancement of GSL. Extensive experiments on various benchmark datasets demonstrate the effectiveness of EGNN and the benefits of evolutionary-computation-based graph structure learning. (c) 2023 Elsevier B.V. All rights reserved.
Pages: 12
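To make the evolutionary loop described in the abstract concrete, below is a minimal, illustrative Python sketch of evolutionary-computation-based graph structure learning: mutation proposes perturbed adjacency matrices, each progeny's parameters are briefly trained on the labeled nodes, and an evaluation step keeps only the fittest progeny. The toy data, the one-layer GCN-style classifier, and the mutate/fitness/train_step helpers are all assumptions made for illustration, not the authors' EGNN implementation or hyperparameters.

```python
"""Minimal sketch of evolutionary graph structure learning (illustrative only)."""
import numpy as np

rng = np.random.default_rng(0)

# Toy node-classification problem: random graph, features, labels (assumed data).
n_nodes, n_feats, n_classes = 60, 16, 3
X = rng.normal(size=(n_nodes, n_feats))
y = rng.integers(0, n_classes, size=n_nodes)
A = (rng.random((n_nodes, n_nodes)) < 0.05).astype(float)
A = np.maximum(A, A.T)                       # symmetric, possibly noisy adjacency
train_idx = np.arange(0, 30)                 # labeled nodes
val_idx = np.arange(30, 60)                  # "unlabeled" nodes act as the environment

def normalize(A):
    """Symmetric normalization with self-loops: D^{-1/2}(A+I)D^{-1/2}."""
    A_hat = A + np.eye(len(A))
    d_inv = 1.0 / np.sqrt(np.maximum(A_hat.sum(1), 1e-8))
    return A_hat * d_inv[:, None] * d_inv[None, :]

def forward(A, W):
    """One-layer GCN-style classifier: softmax(A_norm X W)."""
    logits = normalize(A) @ X @ W
    e = np.exp(logits - logits.max(1, keepdims=True))
    return e / e.sum(1, keepdims=True)

def train_step(A, W, lr=0.1):
    """One gradient step of cross-entropy on the labeled nodes only."""
    P = forward(A, W)
    G = P.copy()
    G[np.arange(n_nodes), y] -= 1.0          # dL/dlogits for one-hot targets
    G[val_idx] = 0.0                         # unlabeled nodes contribute no loss
    grad_W = (normalize(A) @ X).T @ G / len(train_idx)
    return W - lr * grad_W

def fitness(A, W):
    """Environment feedback: accuracy on the unlabeled (validation) nodes."""
    return (forward(A, W).argmax(1)[val_idx] == y[val_idx]).mean()

def mutate(A, n_flips=5):
    """Mutation operator: flip a few random edges to propose a new structure."""
    A_new = A.copy()
    for _ in range(n_flips):
        i, j = rng.integers(0, n_nodes, size=2)
        if i != j:
            A_new[i, j] = A_new[j, i] = 1.0 - A_new[i, j]
    return A_new

# Evolutionary loop: mutate structures, train progeny, keep only the fittest.
pop_size, n_generations = 6, 20
W = rng.normal(scale=0.1, size=(n_feats, n_classes))
best_A, best_fit = A, fitness(A, W)
for gen in range(n_generations):
    progeny = []
    for _ in range(pop_size):
        A_child = mutate(best_A)             # new graph structure via mutation
        W_child = W
        for _ in range(10):                  # adapt parameters to that structure
            W_child = train_step(A_child, W_child)
        progeny.append((fitness(A_child, W_child), A_child, W_child))
    fit, A_cand, W_cand = max(progeny, key=lambda t: t[0])
    if fit >= best_fit:                      # evaluation step retains good progeny
        best_fit, best_A, W = fit, A_cand, W_cand
print("best validation accuracy:", best_fit)
```

In the paper's terms, the validation accuracy plays the role of the environment and the selection step keeps only well-adapted progeny for further optimization; a full implementation would replace this toy setup with a real GNN, dataset, and the mutation and evaluation mechanisms described in the paper.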