Heterogeneous Graph Neural Networks using Self-supervised Reciprocally Contrastive Learning

Cited by: 1
Authors
Huo, Cuiying [1 ]
He, Dongxiao [1 ]
Li, Yawen [2 ]
Jin, Di [1 ]
Dang, Jianwu [3 ]
Pedrycz, Witold [4 ]
Wu, Lingfei [5 ]
Zhang, Weixiong [6 ]
Affiliations
[1] Tianjin Univ, Coll Intelligence & Comp, Tianjin, Peoples R China
[2] Beijing Univ Posts & Telecommun, Sch Econ & Management, Beijing, Peoples R China
[3] Chinese Acad Sci, Shenzhen Inst Adv Technol, Shenzhen, Peoples R China
[4] Univ Alberta, Dept Elect & Comp Engn, Edmonton, AB, Canada
[5] Anytime AI, New York, NY USA
[6] Hong Kong Polytech Univ, Dept Hlth Technol & Informat, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Heterogeneous graphs; Graph neural networks; Representation learning; Contrastive learning; Network noise;
DOI
10.1145/3706115
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Heterogeneous graph neural networks (HGNNs) are a popular technique for modeling and analyzing heterogeneous graphs. Most existing HGNN-based approaches are supervised or semi-supervised learning methods that require annotated graphs, which are costly and time-consuming to obtain. Self-supervised contrastive learning has been proposed to remove the need for annotation by mining intrinsic properties of the given data. However, existing contrastive learning methods are ill-suited to heterogeneous graphs because they construct contrastive views based only on data perturbation or pre-defined structural properties (e.g., meta-paths) while ignoring noise in node attributes and graph topologies. We develop a robust heterogeneous graph contrastive learning approach, named HGCL, which introduces two views guided respectively by node attributes and graph topologies, and integrates and enhances them through a reciprocally contrastive mechanism to better model heterogeneous graphs. In this new approach, we adopt distinct but suitable attribute and topology fusion mechanisms in the two views, which help mine the relevant information in attributes and topologies separately. We further use both attribute similarity and topological correlation to construct high-quality contrastive samples. Extensive experiments on four large real-world heterogeneous graphs demonstrate the superiority and robustness of HGCL over several state-of-the-art methods.
Pages: 21
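To make the reciprocal (cross-view) contrastive idea described in the abstract concrete, here is a minimal, hypothetical PyTorch sketch. The encoders, temperature, thresholds, and positive-sample rule (top-k attribute similarity or a minimum number of shared 2-hop paths as a crude stand-in for meta-path-based topological correlation) are illustrative assumptions, not the authors' actual HGCL implementation.

```python
import torch
import torch.nn.functional as F


def cross_view_nce(z_a, z_b, pos_mask, tau=0.5):
    """InfoNCE-style loss: each node in view A attracts its positives in view B."""
    z_a = F.normalize(z_a, dim=1)
    z_b = F.normalize(z_b, dim=1)
    sim = torch.exp(z_a @ z_b.T / tau)              # (N, N) cross-view similarities
    pos = (sim * pos_mask).sum(dim=1)               # similarity mass on positive pairs
    return -torch.log(pos / sim.sum(dim=1)).mean()


def positive_mask(x, adj, k=5, min_paths=1):
    """Positives: top-k attribute-similar nodes OR nodes sharing enough 2-hop paths
    (a simplified stand-in for meta-path-based topological correlation)."""
    xn = F.normalize(x, dim=1)
    attr_sim = xn @ xn.T
    attr_pos = torch.zeros_like(attr_sim)
    attr_pos.scatter_(1, attr_sim.topk(k, dim=1).indices, 1.0)
    topo_pos = ((adj @ adj) >= min_paths).float()
    mask = ((attr_pos + topo_pos) > 0).float()
    mask.fill_diagonal_(1.0)                        # every node is its own positive
    return mask


# Toy data: 100 nodes, 16-dim attributes, random symmetric adjacency.
N, D, H = 100, 16, 32
x = torch.randn(N, D)
adj = (torch.rand(N, N) < 0.05).float()
adj = ((adj + adj.T) > 0).float()

# Attribute-guided view: an MLP on node attributes.
attr_encoder = torch.nn.Sequential(
    torch.nn.Linear(D, H), torch.nn.ReLU(), torch.nn.Linear(H, H))
# Topology-guided view: one mean-aggregation step over the adjacency matrix.
topo_encoder = torch.nn.Linear(D, H)

z_attr = attr_encoder(x)
z_topo = (adj @ topo_encoder(x)) / adj.sum(1, keepdim=True).clamp(min=1)

mask = positive_mask(x, adj)
# Reciprocal contrast: each view supervises the other.
loss = cross_view_nce(z_attr, z_topo, mask) + cross_view_nce(z_topo, z_attr, mask.T)
loss.backward()
print(f"contrastive loss: {loss.item():.4f}")
```

In the paper's actual method, each view fuses attributes and topology with its own mechanism; the sketch only shows the general shape of a reciprocal contrastive loss and of positive-sample construction from attribute similarity and topological correlation, as summarized in the abstract.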