Heterogeneous Graph Neural Architecture Search

Cited by: 11
Authors
Gao, Yang [1 ,6 ]
Zhang, Peng [2 ]
Li, Zhao [3 ]
Zhou, Chuan [4 ,6 ]
Liu, Yongchao [5 ]
Hu, Yue [1 ,6 ]
Affiliations
[1] Chinese Acad Sci, Inst Informat Engn, Beijing, Peoples R China
[2] Guangzhou Univ, Cyberspace Inst Adv Technol, Guangzhou, Peoples R China
[3] Alibaba Grp, Hangzhou, Peoples R China
[4] Chinese Acad Sci, Acad Math & Syst Sci, Beijing, Peoples R China
[5] Ant Grp, Hangzhou, Peoples R China
[6] Univ Chinese Acad Sci, Sch Cyber Secur, Beijing, Peoples R China
Source
2021 21ST IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2021) | 2021
Keywords
neural architecture search; heterogeneous network; graph neural networks;
DOI
10.1109/ICDM51629.2021.00124
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Heterogeneous Graph Neural networks (HGNNs) have been popularly used in processing complicated networks such as academic networks, social networks, and knowledge graphs. Despite their success, the design of the neural architectures of HGNNs still requires rich domain knowledge and heavy manual work. In this paper, we propose a Heterogeneous Graph Neural Architecture Search algorithm (HGNAS for short) which enables automatic design of the best neural architectures with minimal human effort. Specifically, HGNAS first defines a general HGNN framework on top of existing popular HGNNs. A search space of HGNAS is designed based on the general framework that includes multiple groups of message encoding and aggregation functions. Then, HGNAS uses a policy network as the controller to sample and find the best neural architecture from the designed search space by maximizing the expected accuracy of the selected architectures on a validation dataset. Moreover, we introduce effective methods to improve HGNAS from three aspects, i.e., the optimization of hyper-parameters, the improvement of search space, and the selection of message receptive fields. Experiments on public datasets show that HGNAS is capable of designing novel HGNNs that rival the best human-invented HGNNs. More interestingly, HGNAS finds some sparse yet powerful neural architectures for HGNNs on the benchmark datasets.
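The controller loop described in the abstract, where a policy network samples architectures from the search space and is updated to maximize expected validation accuracy, can be illustrated with a minimal REINFORCE-style sketch. This is not the paper's implementation: the two-decision search space, the `validation_accuracy` stub (which would really mean training the sampled HGNN and evaluating it on held-out data), and all names are illustrative assumptions.

```python
import numpy as np

# Hypothetical search space in the spirit of HGNAS: per architecture, pick one
# message-encoding function and one aggregation function.
SEARCH_SPACE = {
    "encoder": ["linear", "gcn", "gat", "bilinear"],
    "aggregator": ["sum", "mean", "max", "attention"],
}

rng = np.random.default_rng(0)

# Controller: one independent softmax distribution per decision
# (a stand-in for the paper's policy network).
logits = {k: np.zeros(len(v)) for k, v in SEARCH_SPACE.items()}

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sample_architecture():
    """Sample one architecture from the current policy."""
    arch = {}
    for key, options in SEARCH_SPACE.items():
        p = softmax(logits[key])
        arch[key] = options[rng.choice(len(options), p=p)]
    return arch

def validation_accuracy(arch):
    """Stub reward: a real system would train `arch` and score it on a
    validation set. Here a toy preference stands in for that signal."""
    return (0.5
            + 0.2 * (arch["encoder"] == "gat")
            + 0.2 * (arch["aggregator"] == "attention"))

# REINFORCE-style update: raise the probability of architectures whose
# reward beats a moving-average baseline.
baseline, lr = 0.0, 0.2
for step in range(500):
    arch = sample_architecture()
    reward = validation_accuracy(arch)
    baseline = 0.9 * baseline + 0.1 * reward
    advantage = reward - baseline
    for key, options in SEARCH_SPACE.items():
        p = softmax(logits[key])
        grad = -p                                  # d log p / d logits ...
        grad[options.index(arch[key])] += 1.0      # ... for the sampled option
        logits[key] += lr * advantage * grad

# Greedy read-out of the learned policy.
best = {k: v[int(np.argmax(logits[k]))] for k, v in SEARCH_SPACE.items()}
print(best)
```

With the additive toy reward, each decision's gradient signal is independent, so the policy concentrates on the higher-reward option per slot; the same loop structure carries over when the reward is a trained model's validation accuracy.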
Pages: 1066-1071
Page count: 6