GS-InGAT: An interaction graph attention network with global semantic for knowledge graph completion

Cited: 17
Authors
Yin, Hong [1 ]
Zhong, Jiang [1 ]
Wang, Chen [1 ]
Li, Rongzhen [1 ]
Li, Xue [2 ]
Affiliations
[1] Chongqing Univ, Coll Comp Sci, Chongqing 400044, Peoples R China
[2] Univ Queensland, Informat Technol & Elect Engn, Brisbane, Qld 4072, Australia
Keywords
Knowledge graph completion; Global semantic; Semantic graph; Interaction information; Graph attention network;
DOI
10.1016/j.eswa.2023.120380
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Knowledge graph completion (KGC) aims to infer missing links between entities based on the observed ones. Current KGC methods primarily focus on KG embedding models, which project entities and relations into low-dimensional vectors. Recently, the combination of textual information with graph neural network models has drawn extensive attention due to their superiority in utilizing topological structures, benefiting from the message-passing mechanism, and their effectiveness in supplementing structural information. Nevertheless, previous methods suffer from two limitations. First, they treat the textual information as an independent instance to enhance the corresponding entities, without considering the global semantics within the KG. Second, Graph Neural Networks (GNNs) typically assume that the neighbors of a node are independent of each other, ignoring possible interactions between them. To eliminate these limitations, we propose a KGC method called GS-InGAT (Interaction Graph ATtention Network with Global Semantic). Concretely, we utilize a semantic graph to model the semantic relationships and obtain global semantic representations for entities from it. Furthermore, we introduce an efficient Interaction Graph ATtention network (InGAT) that simultaneously captures both the interaction and local information of entities, which are fused to generate structural representations. Finally, we feed the combination of the semantic and structural representations, along with relation representations, into the decoder to score triples. Experimental results demonstrate that GS-InGAT consistently attains competitive performance on benchmark datasets, verifying the effectiveness of considering the global semantics and the interactions between neighbors.
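The final stage of the pipeline described in the abstract can be sketched in a few lines: each entity has a semantic representation (from the semantic graph) and a structural representation (from InGAT), the two views are fused, and a decoder scores candidate triples. This is a minimal illustrative sketch, not the paper's implementation: the entity and relation vectors are random stand-ins, the fusion by concatenation and the DistMult-style decoder are assumptions (the abstract does not specify either), and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # embedding dimension (illustrative)

# Hypothetical per-entity representations produced upstream:
# a "semantic" view (from the semantic graph) and a "structural" view (from InGAT).
entities = ["Paris", "France"]
semantic_rep = {e: rng.normal(size=DIM) for e in entities}
structural_rep = {e: rng.normal(size=DIM) for e in entities}
# Relation vector sized to match the fused entity vector below.
relation_rep = {"capital_of": rng.normal(size=2 * DIM)}

def fuse(entity):
    """Combine the two views of an entity; concatenation is one simple
    fusion choice (the paper's exact fusion is not given in the abstract)."""
    return np.concatenate([semantic_rep[entity], structural_rep[entity]])

def score(head, rel, tail):
    """DistMult-style triple score (an assumed decoder):
    a trilinear product, where higher means more plausible."""
    return float(np.sum(fuse(head) * relation_rep[rel] * fuse(tail)))

s = score("Paris", "capital_of", "France")
```

In a trained model the scores for all candidate tails would be ranked to predict the missing link; here the vectors are random, so only the shape of the computation is meaningful.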
Pages: 15