Adaptive multi-scale Graph Neural Architecture Search framework

Cited: 0
Authors
Yang, Lintao [1 ]
Lio, Pietro [2 ]
Shen, Xu [1 ]
Zhang, Yuyang [1 ]
Peng, Chengbin [1 ]
Affiliations
[1] Ningbo Univ, Fac Elect Engn & Comp Sci, Key Lab Mobile Network Applicat Technol Zhejiang P, Ningbo 315211, Peoples R China
[2] Univ Cambridge, Dept Comp Sci & Technol, Cambridge CB3 0FD, England
Keywords
Graph neural networks; Graph Neural Architecture Search; Graph representation learning; Networks
DOI
10.1016/j.neucom.2024.128094
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph neural networks (GNNs) have gained significant attention for their ability to learn representations from graph-structured data, in which message passing and feature fusion strategies play an essential role. However, traditional Graph Neural Architecture Search (GNAS) mainly focuses on optimization with a static receptive field to ease the search process. To efficiently utilize latent relationships between non-adjacent nodes as well as edge features, this work proposes a novel two-stage approach that is able to optimize GNN structures more effectively by adaptively aggregating neighborhood features at multiple scales. This adaptive multi-scale GNAS is able to assign optimal weights to different neighbors for different graphs and learning tasks. In addition, it incorporates latent relationships and edge features into message passing, and can accommodate different feature fusion strategies. Compared with traditional approaches, ours can explore a much larger and more diversified search space efficiently. We also prove that traditional multi-hop GNNs are low-pass filters, which can remove important low-frequency components of signals from remote neighbors in a graph, and that they are not even expressive enough to distinguish some simple regular graphs, justifying the superiority of our approach. Experiments with seven datasets across three graph learning tasks, including graph regression, node classification, and graph classification, demonstrate that our method yields significant improvement over state-of-the-art GNAS approaches and human-designed GNN approaches. For example, with our framework, the MAE of the 12-layer AM-GNAS was 0.102 on the ZINC dataset, an improvement of over 25%.
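The abstract describes adaptively weighting neighborhood features aggregated at multiple scales (hop distances). The paper's actual architecture is not reproduced in this record, so the following is only an illustrative sketch, assuming symmetric adjacency normalization and a softmax over per-scale weights; the function names and the choice of normalization are assumptions, not the authors' implementation.

```python
import numpy as np

def normalize_adj(A):
    """Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def multi_scale_aggregate(A, X, scale_logits):
    """Weighted sum of k-hop propagated features, k = 0..K.

    scale_logits holds one learnable logit per scale; a softmax turns
    them into the adaptive per-scale weights the abstract refers to.
    """
    A_norm = normalize_adj(A)
    w = np.exp(scale_logits) / np.exp(scale_logits).sum()
    H = np.zeros_like(X, dtype=float)
    P = X.astype(float)          # P starts as the 0-hop (raw) features
    for wk in w:
        H += wk * P              # accumulate this scale's contribution
        P = A_norm @ P           # propagate one more hop
    return H
```

In a trained model the scale logits would be optimized jointly with the rest of the architecture, so each graph and task ends up with its own mixture of hop distances rather than a fixed receptive field.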
Pages: 10