Heterogeneous Graph Transformer

Cited by: 953
Authors
Hu, Ziniu [1 ]
Dong, Yuxiao [2 ]
Wang, Kuansan [2 ]
Sun, Yizhou [1 ]
Affiliations
[1] Univ Calif Los Angeles, Los Angeles, CA 90024 USA
[2] Microsoft Res, Redmond, WA USA
Source
WEB CONFERENCE 2020: PROCEEDINGS OF THE WORLD WIDE WEB CONFERENCE (WWW 2020) | 2020
Funding
U.S. National Science Foundation;
Keywords
Graph Neural Networks; Heterogeneous Information Networks; Representation Learning; Graph Embedding; Graph Attention;
DOI
10.1145/3366423.3380027
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Recent years have witnessed the emerging success of graph neural networks (GNNs) for modeling structured data. However, most GNNs are designed for homogeneous graphs, in which all nodes and edges belong to the same types, making it infeasible to represent heterogeneous structures. In this paper, we present the Heterogeneous Graph Transformer (HGT) architecture for modeling Web-scale heterogeneous graphs. To model heterogeneity, we design node- and edge-type dependent parameters to characterize the heterogeneous attention over each edge, empowering HGT to maintain dedicated representations for different types of nodes and edges. To handle Web-scale graph data, we design the heterogeneous mini-batch graph sampling algorithm-HGSampling-for efficient and scalable training. Extensive experiments on the Open Academic Graph of 179 million nodes and 2 billion edges show that the proposed HGT model consistently outperforms all the state-of-the-art GNN baselines by 9%-21% on various downstream tasks. The dataset and source code of HGT are publicly available at https://github.com/acbull/pyHGT.
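To make the node- and edge-type dependent attention idea described in the abstract more concrete, below is a minimal, illustrative PyTorch sketch. It is not the authors' implementation (see the pyHGT repository linked above); the class name TypedAttentionSketch, the single-head simplification, and all dimensions are assumptions made purely for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TypedAttentionSketch(nn.Module):
    """Toy, single-head sketch of node/edge-type dependent attention.

    NOT the authors' implementation (see https://github.com/acbull/pyHGT);
    names, shapes, and simplifications here are illustrative assumptions.
    """

    def __init__(self, dim, num_node_types, num_edge_types):
        super().__init__()
        # Separate Key/Query/Value projections per node type
        # ("node-type dependent parameters").
        self.k_lin = nn.ModuleList([nn.Linear(dim, dim) for _ in range(num_node_types)])
        self.q_lin = nn.ModuleList([nn.Linear(dim, dim) for _ in range(num_node_types)])
        self.v_lin = nn.ModuleList([nn.Linear(dim, dim) for _ in range(num_node_types)])
        # One learnable interaction matrix per edge type
        # ("edge-type dependent parameters").
        self.w_att = nn.Parameter(torch.randn(num_edge_types, dim, dim) * 0.01)
        self.scale = dim ** 0.5

    def forward(self, h_src, h_dst, src_type, dst_type, edge_type):
        # h_src: [n_src, dim] features of source neighbors, all of node type src_type
        # h_dst: [dim]        feature of one target node of node type dst_type
        k = self.k_lin[src_type](h_src)   # type-specific key projection
        q = self.q_lin[dst_type](h_dst)   # type-specific query projection
        v = self.v_lin[src_type](h_src)   # type-specific message (value) projection
        # Attention scores modulated by the edge-type specific matrix.
        scores = (k @ self.w_att[edge_type] @ q) / self.scale   # [n_src]
        att = F.softmax(scores, dim=0)
        return att @ v                     # aggregated message for the target node


# Toy usage: 3 "paper" neighbors attend to 1 "author" node over a single edge type.
layer = TypedAttentionSketch(dim=8, num_node_types=2, num_edge_types=1)
out = layer(torch.randn(3, 8), torch.randn(8), src_type=0, dst_type=1, edge_type=0)
print(out.shape)  # torch.Size([8])
```

The actual HGT additionally uses multi-head attention and further meta-relation-dependent components, and is trained with the HGSampling mini-batch sampler mentioned in the abstract; refer to the paper and the pyHGT repository for details.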
Pages: 2704-2710
Number of pages: 7