Heterogeneous Information Network Representation Learning Framework Based on Graph Attention Network

Cited: 6
Authors
Kang Shize [1 ]
Ji Lixin [1 ]
Zhang Jianpeng [1 ]
Affiliations
[1] Informat Engn Univ, Zhengzhou 450001, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Heterogeneous information network; Knowledge graph; Graph attention network; Representation learning;
DOI
10.11999/JEIT200034
CLC Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Codes
0808; 0809;
Abstract
Commonly used heterogeneous information networks include knowledge graphs and heterogeneous information networks with simple schemas, whose representation learning usually follows different methods. This paper summarizes the similarities and differences between knowledge graphs and simple-schema heterogeneous information networks, and proposes a general representation learning framework for heterogeneous information networks. The framework consists of three parts: a basic vector model, a propagation model based on graph attention networks, and a task model. The basic vector model learns the basic vectors of the network; the propagation model learns high-order neighbor features of the network by stacking attention layers; and the replaceable task model adapts the framework to different application scenarios. Compared with benchmark models, the proposed framework achieves relatively good results on the link prediction task for knowledge graphs and the node classification task for heterogeneous information networks.
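The propagation model described in the abstract stacks graph attention layers to aggregate high-order neighbor features. As a rough illustration only (not the authors' implementation), a single graph attention layer in the style of the standard GAT can be sketched in NumPy; the function name `gat_layer`, the dense adjacency representation, and the single-head setup are assumptions for the example:

```python
import numpy as np

def leaky_relu(x, alpha=0.2):
    return np.where(x > 0, x, alpha * x)

def gat_layer(H, A, W, a):
    # H: (N, F)  input node features
    # A: (N, N)  adjacency matrix with self-loops (A[i, j] = 1 if j is a neighbor of i)
    # W: (F, Fp) shared linear projection
    # a: (2*Fp,) attention vector, split into source and destination halves
    Z = H @ W                                    # (N, Fp) projected features
    Fp = Z.shape[1]
    src = Z @ a[:Fp]                             # (N,) contribution of node i
    dst = Z @ a[Fp:]                             # (N,) contribution of neighbor j
    e = leaky_relu(src[:, None] + dst[None, :])  # (N, N) raw attention logits
    e = np.where(A > 0, e, -1e9)                 # mask non-neighbors before softmax
    e = e - e.max(axis=1, keepdims=True)         # numerical stability
    att = np.exp(e)
    att = att / att.sum(axis=1, keepdims=True)   # row-wise softmax over neighbors
    return att @ Z                               # (N, Fp) attention-weighted aggregation
```

Stacking two such layers (with a nonlinearity between them) lets each node attend over its two-hop neighborhood, which is the "high-order neighbor features" idea the abstract refers to.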
Pages: 915-922
Page count: 8
References
14 records in total
[1]  
Bordes A., 2013, Advances in Neural Information Processing Systems, V26, P2787, DOI 10.5555/2999792.2999923
[2]  
Chen K., 2013, INT C LEARNING REPRE, P1
[3]  
Dettmers T, 2018, AAAI CONF ARTIF INTE, P1811
[4]   metapath2vec: Scalable Representation Learning for Heterogeneous Networks [J].
Dong, Yuxiao ;
Chawla, Nitesh V. ;
Swami, Ananthram .
KDD'17: PROCEEDINGS OF THE 23RD ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, 2017, :135-144
[5]  
Nguyen HT, 2018, ASIA S PACIF DES AUT, P327, DOI 10.1109/ASPDAC.2018.8297343
[6]  
King DB, 2015, ACS SYM SER, V1214, P1, DOI 10.1021/bk-2015-1214.ch001
[7]  
Nathani D, 2019, 57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), P4710
[8]  
Perozzi B, 2014, PROC 20 ACM SIGKDD, V20, P701, DOI 10.1145/2623330.2623732
[9]  
Shang C, 2019, AAAI CONF ARTIF INTE, P3060
[10]  
Shang Jingbo, 2016, arXiv:1610.09769v1