Decoupled Graph Neural Architecture Search with Variable Propagation Operation and Appropriate Depth

Cited: 0
Authors
Gao, Jianliang [1]
He, Changlong [1]
Chen, Jiamin [1]
Li, Qiutong [1]
Wang, Yili [1]
Affiliations
[1] Central South University, Changsha, People's Republic of China
Source
35TH INTERNATIONAL CONFERENCE ON SCIENTIFIC AND STATISTICAL DATABASE MANAGEMENT, SSDBM 2023 | 2023
Funding
National Natural Science Foundation of China
Keywords
Decoupled Graph neural networks; Graph neural architecture search learning; Propagation operation;
DOI
10.1145/3603719.3603729
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
To alleviate the over-smoothing problem caused by deep graph neural networks, decoupled graph neural networks (DGNNs) have been proposed. DGNNs decouple the graph neural network into two atomic operations: the propagation (P) operation and the transformation (T) operation. Since manually designing DGNN architectures is a time-consuming and expert-dependent process, the DF-GNAS method was designed to automatically construct DGNN architectures with a fixed propagation operation and deep layers. The propagation operation is the key process by which DGNNs aggregate graph structure information. However, because DF-GNAS applies the same fixed propagation operation to different graph structures, it suffers performance loss. Meanwhile, DF-GNAS builds deep DGNNs even for graphs with simple distributions, which may lead to overfitting. To address these challenges, we propose the Decoupled Graph Neural Architecture Search with Variable Propagation Operation and Appropriate Depth (DGNAS-PD) method. In DGNAS-PD, we design a DGNN operation space with variable, efficient propagation operations to better aggregate information on different graph structures. We also build an effective genetic search strategy that adaptively selects appropriate DGNN depths, rather than always constructing deep DGNNs, for graphs with simple distributions. Experiments on five real-world graphs show that DGNAS-PD outperforms state-of-the-art baseline methods.
Pages: 4
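To make the decoupled design and the search space described in the abstract concrete, below is a minimal Python sketch (not the authors' implementation) of a DGNN whose propagation (P) operator and propagation depth are exposed as searchable choices, followed by a toy genetic search over that space. The operator names (simple_propagate for SGC-style smoothing, ppr_propagate for APPNP-style personalized PageRank), the depth choices, and the genetic-search details are illustrative assumptions; in particular, fitness is a placeholder for training the candidate DGNN and measuring its validation accuracy.

```python
import random
import numpy as np

def normalize_adj(adj):
    """Symmetrically normalize an adjacency matrix with self-loops:
    D^{-1/2} (A + I) D^{-1/2}."""
    a_hat = adj + np.eye(adj.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    return a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def simple_propagate(a_norm, x, depth):
    """SGC-style propagation: repeatedly smooth features over the graph."""
    h = x
    for _ in range(depth):
        h = a_norm @ h
    return h

def ppr_propagate(a_norm, x, depth, alpha=0.1):
    """APPNP-style personalized-PageRank propagation with teleport factor alpha."""
    h = x
    for _ in range(depth):
        h = (1.0 - alpha) * (a_norm @ h) + alpha * x
    return h

# The searchable "operation space": candidate propagation operators and depths.
PROPAGATION_OPS = {"simple": simple_propagate, "ppr": ppr_propagate}
DEPTH_CHOICES = [2, 4, 8, 16]

def decoupled_forward(adj, x, weights, prop_op="ppr", depth=4):
    """P then T: propagate raw features over the graph, then transform them.
    In a real model the transformation would be a trained MLP, not a fixed matrix."""
    a_norm = normalize_adj(adj)
    h = PROPAGATION_OPS[prop_op](a_norm, x, depth)
    return h @ weights  # transformation (T) step

def fitness(gene):
    """Placeholder fitness. In practice: build the DGNN encoded by `gene`
    (propagation operator, depth), train it, and return validation accuracy."""
    _prop_op, _depth = gene
    return random.random()

def genetic_search(generations=10, pop_size=8, mutation_rate=0.3):
    """Toy genetic search over (propagation operator, depth) genes."""
    population = [(random.choice(list(PROPAGATION_OPS)), random.choice(DEPTH_CHOICES))
                  for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]          # selection: keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            child = (p1[0], p2[1])                 # crossover: mix operator and depth
            if random.random() < mutation_rate:    # mutation: resample the gene
                child = (random.choice(list(PROPAGATION_OPS)),
                         random.choice(DEPTH_CHOICES))
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

if __name__ == "__main__":
    best_op, best_depth = genetic_search()
    print("best candidate:", best_op, best_depth)
```

The abstract's key point, that both the propagation operator and the depth are chosen per graph by the search rather than fixed in advance, corresponds here to letting the genetic search pick both entries of the gene; the actual DGNAS-PD operation space and search procedure are defined in the paper itself.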