ReLMKG: reasoning with pre-trained language models and knowledge graphs for complex question answering

Cited by: 23
Authors
Cao, Xing [1 ,2 ]
Liu, Yun [1 ,2 ]
Affiliations
[1] Beijing Jiaotong Univ, Sch Elect & Informat Engn, Beijing 100044, Peoples R China
[2] Beijing Municipal Commiss Educ, Key Lab Commun & Informat Syst, Beijing 100044, Peoples R China
Keywords
Complex question answering; Pre-trained language model; Knowledge graph; Joint reasoning; WEB;
DOI
10.1007/s10489-022-04123-w
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The goal of complex question answering over knowledge bases (KBQA) is to find an answer entity in a knowledge graph. Recent information retrieval-based methods have focused on the topology of the knowledge graph while ignoring the inconsistency between knowledge graph embeddings and natural language embeddings, and they cannot effectively utilize both implicit and explicit knowledge for reasoning. In this paper, we propose a novel model, ReLMKG, to address these challenges. The approach performs joint reasoning over a pre-trained language model and the associated knowledge graph. The complex question and textual paths are encoded by the language model, which bridges the gap between the question and the knowledge graph and exploits implicit knowledge without introducing additional unstructured text. The outputs of different layers in the language model serve as instructions that guide a graph neural network to perform message propagation and aggregation step by step, exploiting the explicit knowledge contained in the structured knowledge graph. We analyse the reasoning ability of ReLMKG on knowledge graphs with different degrees of sparseness and evaluate the generalizability of the model. Experiments conducted on the ComplexWebQuestions and WebQuestionsSP datasets demonstrate the effectiveness of our approach on KBQA tasks.
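The core mechanism described above (hidden states from successive language-model layers acting as step-wise instructions that guide a graph neural network's message propagation over the question subgraph) can be illustrated with a small sketch. This is not the authors' ReLMKG implementation: the module names, the sigmoid gating, the sum aggregation, and the toy dimensions below are assumptions chosen only to make the instruction-guided message-passing pattern concrete.

import torch
import torch.nn as nn

class InstructionGuidedGNN(nn.Module):
    # Hypothetical sketch: one linear message transform and one gate per reasoning step.
    def __init__(self, dim: int, num_steps: int):
        super().__init__()
        self.num_steps = num_steps
        self.msg_proj = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_steps))
        self.gate = nn.ModuleList(nn.Linear(2 * dim, dim) for _ in range(num_steps))

    def forward(self, node_feats, edge_index, instructions):
        # node_feats:   (num_nodes, dim) initial entity embeddings
        # edge_index:   (2, num_edges) source/target node indices of the subgraph
        # instructions: list of num_steps vectors of size (dim,), e.g. pooled hidden
        #               states taken from different layers of the language model
        h = node_feats
        src, dst = edge_index
        for t in range(self.num_steps):
            # Condition each edge message on the step-t instruction vector.
            instr = instructions[t].expand(src.size(0), -1)
            gate = torch.sigmoid(self.gate[t](torch.cat([h[src], instr], dim=-1)))
            msg = self.msg_proj[t](h[src]) * gate
            # Aggregate messages onto target nodes by summation, then update.
            agg = torch.zeros_like(h).index_add_(0, dst, msg)
            h = torch.relu(h + agg)
        return h  # final node states, to be scored against the question to rank answers

# Toy usage: 4 nodes, 3 edges, 2 reasoning steps.
gnn = InstructionGuidedGNN(dim=8, num_steps=2)
nodes = torch.randn(4, 8)
edges = torch.tensor([[0, 1, 2], [1, 2, 3]])
instr = [torch.randn(8), torch.randn(8)]
out = gnn(nodes, edges, instr)
print(out.shape)  # torch.Size([4, 8])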
Pages: 12032-12046
Number of pages: 15