Heterogeneous Graph Neural Architecture Search

Cited by: 11
Authors
Gao, Yang [1 ,6 ]
Zhang, Peng [2 ]
Li, Zhao [3 ]
Zhou, Chuan [4 ,6 ]
Liu, Yongchao [5 ]
Hu, Yue [1 ,6 ]
Affiliations
[1] Chinese Acad Sci, Inst Informat Engn, Beijing, Peoples R China
[2] Guangzhou Univ, Cyberspace Inst Adv Technol, Guangzhou, Peoples R China
[3] Alibaba Grp, Hangzhou, Peoples R China
[4] Chinese Acad Sci, Acad Math & Syst Sci, Beijing, Peoples R China
[5] Ant Grp, Hangzhou, Peoples R China
[6] Univ Chinese Acad Sci, Sch Cyber Secur, Beijing, Peoples R China
Source
2021 21ST IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2021) | 2021
Keywords
neural architecture search; heterogeneous network; graph neural networks;
DOI
10.1109/ICDM51629.2021.00124
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Heterogeneous Graph Neural Networks (HGNNs) have been widely used to process complex networks such as academic networks, social networks, and knowledge graphs. Despite their success, designing the neural architectures of HGNNs still requires rich domain knowledge and heavy manual work. In this paper, we propose a Heterogeneous Graph Neural Architecture Search algorithm (HGNAS for short) that enables automatic design of the best neural architectures with minimal human effort. Specifically, HGNAS first defines a general HGNN framework on top of existing popular HGNNs. A search space is then designed on the basis of this general framework and includes multiple groups of message encoding and aggregation functions. HGNAS then uses a policy network as the controller to sample architectures from the designed search space and find the best one by maximizing the expected accuracy of the selected architectures on a validation dataset. Moreover, we introduce effective methods to improve HGNAS in three respects, i.e., the optimization of hyper-parameters, the refinement of the search space, and the selection of message receptive fields. Experiments on public datasets show that HGNAS is capable of designing novel HGNNs that rival the best human-invented HGNNs. More interestingly, HGNAS finds sparse yet powerful neural architectures for HGNNs on the benchmark datasets.
Pages: 1066-1071
Number of pages: 6
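
The abstract above describes a controller that samples candidate architectures from a search space of message encoding and aggregation functions and is trained to maximize expected validation accuracy. The snippet below is a minimal, illustrative sketch of such a policy-gradient (REINFORCE-style) search loop, not the paper's actual implementation: the slot names in SEARCH_SPACE, the candidate function names, and the evaluate_architecture stub are assumed placeholders.

```python
# Minimal sketch of a controller-based neural architecture search loop (assumed, illustrative):
# a categorical policy samples one candidate per design slot and is updated with REINFORCE
# using validation accuracy as the reward. Names below are placeholders, not HGNAS's API.
import math
import random

# Hypothetical search space: each slot lists candidate functions for one design decision.
SEARCH_SPACE = {
    "encode_fn": ["linear", "gcn", "gat", "edge_type_specific"],
    "aggregate_fn": ["sum", "mean", "max", "attention"],
}

# One vector of softmax logits per slot: the simplest possible "policy network".
logits = {slot: [0.0] * len(opts) for slot, opts in SEARCH_SPACE.items()}

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def sample_architecture():
    """Sample one candidate architecture and remember the chosen option indices."""
    arch, choices = {}, {}
    for slot, opts in SEARCH_SPACE.items():
        probs = softmax(logits[slot])
        idx = random.choices(range(len(opts)), weights=probs, k=1)[0]
        arch[slot], choices[slot] = opts[idx], idx
    return arch, choices

def evaluate_architecture(arch):
    """Placeholder reward: in practice, train the sampled HGNN and return validation accuracy."""
    return random.random()

def reinforce_update(choices, reward, baseline, lr=0.1):
    """REINFORCE: raise the log-probability of the sampled choices, scaled by the advantage."""
    advantage = reward - baseline
    for slot, idx in choices.items():
        probs = softmax(logits[slot])
        for j in range(len(logits[slot])):
            grad = (1.0 if j == idx else 0.0) - probs[j]  # d log pi / d logit_j
            logits[slot][j] += lr * advantage * grad

baseline = 0.0
for step in range(200):
    arch, choices = sample_architecture()
    reward = evaluate_architecture(arch)
    baseline = 0.9 * baseline + 0.1 * reward  # moving-average baseline to reduce variance
    reinforce_update(choices, reward, baseline)
```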