Multi-View Contrastive Enhanced Heterogeneous Graph Structure Learning

Cited by: 0
Authors
Bing R. [1 ]
Yuan G. [1 ,2 ]
Meng F. [1 ]
Wang S. [3 ]
Qiao S. [4 ]
Wang Z. [1 ]
Affiliations
[1] School of Computer Science and Technology, China University of Mining and Technology, Xuzhou
[2] Mine Digitalization Engineering Research Center, Ministry of Education, Xuzhou
[3] School of Computer Science and Engineering, Central South University, Changsha
[4] School of Software Engineering, Chengdu University of Information Technology, Chengdu
Source
Ruan Jian Xue Bao/Journal of Software | 2023, Vol. 34, No. 10
Keywords
graph contrastive learning; graph neural network; graph structure learning; heterogeneous graph; self-supervised learning;
DOI
10.13328/j.cnki.jos.006883
Abstract
As heterogeneous graph representation learning methods, heterogeneous graph neural networks can effectively extract complex structural and semantic information from heterogeneous graphs and have achieved excellent performance in node classification and link prediction tasks, providing strong support for the representation and analysis of knowledge graphs. However, because heterogeneous graphs contain noisy or missing interactions, a heterogeneous graph neural network incorporates erroneous neighbor features when aggregating and updating nodes, which degrades overall model performance. To solve this problem, this study proposes a heterogeneous graph structure learning model enhanced by multi-view contrast. First, the semantic information in the heterogeneous graph is preserved with meta-paths, and a similarity graph is generated by computing the feature similarity between nodes under each meta-path; this similarity graph is then fused with the meta-path graph to optimize the graph structure. By contrasting the similarity graph and the meta-path graph as different views, the graph structure is optimized without supervision information, eliminating the dependence on supervised signals. Finally, because the learning ability of a neural network model is insufficient in the early stage of training and the generated graph structure often contains erroneous interactions, a progressive graph structure fusion method is designed. Through incremental weighted addition of the meta-path graph and the similarity graph, the weight of the similarity graph is gradually increased during fusion. This not only prevents erroneous interactions from being introduced in the early stage of training, but also uses the interactions in the similarity graph to suppress noisy interactions and complete missing ones, thereby optimizing the structure of the heterogeneous graph.
Node classification and node clustering are selected as the verification tasks for graph structure learning. Experimental results on four real heterogeneous graph datasets show that the proposed heterogeneous graph structure learning method is feasible and effective. Compared with the best-performing baseline model, the proposed model achieves significantly better performance under both evaluation metrics. © 2023 Chinese Academy of Sciences. All rights reserved.
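The two structure-learning steps described above can be sketched in code. The sketch below is illustrative only, based on the abstract's description rather than the paper's actual implementation: `similarity_graph` builds a sparsified cosine-similarity graph from node features under one meta-path (the top-k sparsification and cosine metric are assumptions), and `progressive_fusion` performs the incrementally weighted addition of the meta-path graph and the similarity graph, with the similarity graph's weight growing as training proceeds (the linear schedule and `max_weight` cap are also assumptions).

```python
import numpy as np

def similarity_graph(features: np.ndarray, k: int = 5) -> np.ndarray:
    """Build a similarity graph from node features under one meta-path.

    Computes pairwise cosine similarity, removes self-loops, and keeps
    only each node's k most similar neighbors (a common sparsification
    choice; the paper's exact procedure may differ).
    """
    norm = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-12)
    sim = norm @ norm.T                       # pairwise cosine similarity
    np.fill_diagonal(sim, 0.0)                # drop self-loops
    thresh = np.sort(sim, axis=1)[:, -k][:, None]  # k-th largest per row
    return np.where(sim >= thresh, sim, 0.0)

def progressive_fusion(meta_path_adj: np.ndarray, sim_adj: np.ndarray,
                       epoch: int, total_epochs: int,
                       max_weight: float = 0.5) -> np.ndarray:
    """Incrementally weighted addition of meta-path and similarity graphs.

    Early in training the fused graph is dominated by the original
    meta-path structure; the similarity graph's weight grows with
    training progress so its interactions can suppress noisy edges or
    complete missing ones. The linear schedule is an assumption.
    """
    alpha = max_weight * min(epoch / total_epochs, 1.0)
    return (1.0 - alpha) * meta_path_adj + alpha * sim_adj
```

At epoch 0 the fused graph equals the meta-path graph, so no similarity-graph edges are introduced while the model's representations are still unreliable; by the end of training the two graphs are mixed at the `max_weight` ratio.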
References
49 references in total
[11]  
Hu Z, Dong Y, Wang K, Sun Y., Heterogeneous graph transformer, Proc. of the 29th Web Conference, pp. 2704-2710, (2020)
[12]  
Jin W, Li Y, Xu H, Wang Y, Ji S, Aggarwal C, Tang J., Adversarial attacks and defenses on graphs, ACM SIGKDD Explorations Newsletter, 22, 2, pp. 19-34, (2021)
[13]  
Wu B, Li J, Hou C, Fu G, Bian Y, Chen L, Huang J., Recent advances in reliable deep graph learning: Adversarial attack, inherent noise, and distribution shift
[14]  
Li R, Wang S, Zhu F, Huang J., Adaptive graph convolutional neural networks, Proc. of the 32nd Conference on Artificial Intelligence, pp. 3546-3553, (2018)
[15]  
Jiang B, Zhang Z, Lin D, Tang J, Luo B., Semi-supervised learning with graph learning-convolutional networks, Proc. of the 20th IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 11313-11320, (2019)
[16]  
Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez A N, Polosukhin I., Attention is all you need, Proc. of the 31st Int’l Conf. on Neural Information Processing Systems, pp. 6000-6010, (2017)
[17]  
Zhou S, Bu J, Wang X, Chen J, Wang C., HAHE: Hierarchical attentive heterogeneous information network embedding
[18]  
Yun S, Jeong M, Kim R, Kang J, Kim H J., Graph transformer networks, Proc. of the 33rd Int’l Conf. on Neural Information Processing Systems, pp. 11983-11993, (2019)
[19]  
Hong H, Guo H, Lin Y, Yang X, Li Z, Ye J., An attention-based graph neural network for heterogeneous structural learning, Proc. of the 34th Conference on Artificial Intelligence, pp. 4132-4139, (2020)
[20]  
Yang Y, Guan Z, Li J, Zhao W, Cui J, Wang Q., Interpretable and efficient heterogeneous graph convolutional network, IEEE Trans. on Knowledge and Data Engineering, (2021)