Knowledge Graph Completion Method of Combining Structural Information with Semantic Information

Cited by: 0
Authors
Hu, Binhao [1 ]
Zhang, Jianpeng [2 ]
Chen, Hongchang [2 ]
Affiliations
[1] Zhengzhou Univ, Sch Cyber Sci & Engn, Zhengzhou 450002, Peoples R China
[2] Natl Digital Switching Syst Engn & Technol R&D Ctr, Zhengzhou 450002, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Training; Semantics; Knowledge graphs; Predictive models; Vectors; Robustness; Explosions; Standards; Testing; Context modeling; Knowledge graph; Knowledge graph completion; Representation learning;
DOI
10.23919/cje.2022.00.299
CLC classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline codes
0808; 0809;
Abstract
With the development of knowledge graphs, a range of applications built on them have emerged. Because knowledge graphs are incomplete, the performance of these downstream applications depends on the quality of the underlying graph. To improve that quality, translation-based graph embedding methods such as TransE learn structural information by representing triples as low-dimensional dense vectors. However, such methods generalize poorly to unseen entities, i.e., entities that are not observed during training but appear at test time. Other methods exploit the representational power of pre-trained language models to learn entity descriptions and contextual representations of triples. Although these methods are robust to incompleteness, they must score every candidate entity for each triple during inference. We combine the two kinds of models: semantic information improves robustness to unseen entities, while structural information reduces inference overhead and prevents combinatorial explosion. Specifically, we use a pre-trained language model to encode triples and learn the semantic information they contain, use a hyperbolic-space distance model to learn structural information, and then integrate the two types of information. We evaluate the model with link prediction experiments on standard benchmark datasets. The experimental results show that our model outperforms state-of-the-art methods on two standard datasets.
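The abstract names three ingredients: a translation-based structural score (TransE), a hyperbolic-space distance model, and a fusion of structural and semantic scores. A minimal sketch of these scoring functions, assuming a simple weighted fusion with a mixing parameter `alpha` (the paper's actual integration mechanism is not specified in this record):

```python
import numpy as np

def transe_score(h, r, t):
    # TransE models a valid triple (h, r, t) as a translation h + r ≈ t,
    # so the score is the negated distance between h + r and t
    # (0 is the best possible score).
    return -np.linalg.norm(h + r - t)

def poincare_distance(u, v, eps=1e-9):
    # Distance between two points inside the Poincare ball, a hyperbolic
    # model commonly used to capture hierarchical graph structure.
    sq_dist = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq_dist / (denom + eps))

def combined_score(structural_score, semantic_score, alpha=0.5):
    # Hypothetical fusion: weight a structural score (e.g. a negated
    # hyperbolic distance) against a semantic score, e.g. one produced
    # by a pre-trained language-model encoder.
    return alpha * structural_score + (1.0 - alpha) * semantic_score
```

For example, `transe_score([1, 0], [0, 1], [1, 1])` is 0 because the translation is exact, and the Poincaré distance grows without bound as points approach the boundary of the unit ball, which is what lets hyperbolic embeddings spread out deep hierarchies.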
Pages: 1412-1420
Page count: 9