Decoupled graph neural architecture search with explainable variable propagation operation

Cited by: 0
Authors
He, Changlong [1 ]
Chen, Jiamin [1 ]
Li, Qiutong [1 ]
Wang, Yili [1 ]
Gao, Jianliang [1 ]
Affiliations
[1] Central South University, School of Computer Science and Engineering, Changsha, Hunan, People's Republic of China
Funding
National Natural Science Foundation of China
Keywords
Decoupled graph neural network; Graph neural architecture search; Propagation operation; Evolutionary algorithm
DOI
10.1007/s10115-024-02329-7
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Most graph neural networks (GNNs) suffer from the over-smoothing problem, which limits further performance improvement. Many studies therefore decouple the GNN into two atomic operations, the propagation (P) operation and the transformation (T) operation, a paradigm known as decoupled graph neural networks (DGNNs), to alleviate this problem. Because manually designing DGNN architectures is time-consuming and expert-dependent, decoupled graph neural architecture search (DGNAS) methods have been proposed and have achieved success. However, existing DGNAS methods offer no explanation for why DGNN architectures should be designed with an adaptive, variable P operation, which hinders researchers from further exploring DGNAS methods. In addition, the naive evolutionary search algorithm used by previous DGNAS methods imposes no constraints on the search direction, limiting its efficiency in exploring DGNN architectures. To address these challenges, we propose the decoupled graph neural architecture search with explainable variable propagation operation (DGNAS-EP) method. Specifically, we propose the mean distinguishability (MD) metric to measure how distinguishable the node representations are, which explains why DGNAS methods should build DGNN architectures with a variable P operation: graphs with different distributions require different P operations so that the DGNN architecture can adapt and reach the optimal MD, which is essential for improving DGNN performance. Furthermore, DGNAS-EP uses the explored historical DGNN architectures as prior knowledge to constrain the search direction according to the evolutionary state, which effectively improves the search efficiency of the DGNAS method. Experiments on real-world graphs show that our proposed DGNAS-EP outperforms state-of-the-art baseline methods. Code is available at https://github.com/frankdoge/DGNAS-EP.git.
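To make the decoupled paradigm described in the abstract concrete, the sketch below separates a variable number of propagation (P) steps from a single transformation (T) step and scores the result with a mean-pairwise-distance proxy for how distinguishable node representations remain. The function names (decoupled_forward, mean_distinguishability), the choice of a single linear T layer, and the distance-based score are illustrative assumptions; they are not the paper's DGNAS-EP formulation or its exact MD metric.

```python
# Minimal sketch of the decoupled GNN paradigm: a variable number of
# propagation (P) steps over the normalized adjacency, followed by one
# transformation (T) step. The distinguishability score is a hypothetical
# stand-in (mean pairwise L2 distance), NOT the paper's MD metric.

import numpy as np


def normalized_adjacency(adj: np.ndarray) -> np.ndarray:
    """Symmetrically normalized adjacency with self-loops: D^{-1/2}(A+I)D^{-1/2}."""
    a_hat = adj + np.eye(adj.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    return d_inv_sqrt @ a_hat @ d_inv_sqrt


def decoupled_forward(adj: np.ndarray, x: np.ndarray, w: np.ndarray, num_p: int) -> np.ndarray:
    """Apply `num_p` propagation (P) steps, then one linear transformation (T) step."""
    p_hat = normalized_adjacency(adj)
    h = x
    for _ in range(num_p):       # variable P operation: depth is a search choice
        h = p_hat @ h
    return h @ w                 # T operation (here a single linear layer)


def mean_distinguishability(h: np.ndarray) -> float:
    """Hypothetical proxy: mean pairwise L2 distance between node representations."""
    diff = h[:, None, :] - h[None, :, :]
    return float(np.sqrt((diff ** 2).sum(-1)).mean())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    adj = np.array([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=float)   # toy 4-node path graph
    x = rng.normal(size=(4, 8))                   # node features
    w = rng.normal(size=(8, 3))                   # transformation weights
    # More P steps over-smooth the features: the proxy score shrinks with depth.
    for k in (0, 2, 8, 32):
        h = decoupled_forward(adj, x, w, num_p=k)
        print(k, round(mean_distinguishability(h), 4))
```

Running this on the toy graph shows the distinguishability proxy shrinking as the number of P steps grows, which illustrates the over-smoothing behavior the abstract argues a variable P operation must counteract.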
Pages: 4677-4702
Number of pages: 26