Bridging the KB-Text Gap: Leveraging Structured Knowledge-aware Pre-training for KBQA

Cited by: 2
Authors
Dong, Guanting [1 ]
Li, Rumei [2 ]
Wang, Sirui [2 ]
Zhang, Yupeng [3 ]
Xian, Yunsen [2 ]
Xu, Weiran [1 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Beijing, Peoples R China
[2] Meituan Grp, Beijing, Peoples R China
[3] Beijing Univ Aeronaut & Astronaut, Beijing, Peoples R China
Source
PROCEEDINGS OF THE 32ND ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2023 | 2023
Keywords
KBQA; Structured Knowledge; Pre-training; Efficient Linearization;
DOI
10.1145/3583780.3615150
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Knowledge Base Question Answering (KBQA) aims to answer natural language questions with factual information such as entities and relations in KBs. However, traditional Pre-trained Language Models (PLMs) are pre-trained directly on large-scale natural language corpora, which makes it difficult for them to understand and represent the complex subgraphs found in structured KBs. To bridge the gap between text and structured KBs, we propose a Structured Knowledge-aware Pre-training method (SKP). In the pre-training stage, we introduce two novel structured knowledge-aware tasks that guide the model to learn implicit relationships and better representations of complex subgraphs. For the downstream KBQA task, we further design an efficient linearization strategy and an interval attention mechanism, which respectively help the model encode complex subgraphs and shield it from the interference of irrelevant subgraphs during reasoning. Detailed experiments and analyses on WebQSP verify the effectiveness of SKP, in particular a significant improvement in subgraph retrieval (+4.08% H@10).
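The interval attention mechanism mentioned in the abstract can be illustrated with a minimal sketch. Assuming, as a hypothetical reconstruction rather than the paper's released code, that the linearized subgraphs are concatenated after the question and each token carries a segment id (0 for the question, 1..k for subgraphs), a block-wise boolean mask lets each subgraph's tokens attend only within their own interval and to the question, shielding them from irrelevant subgraphs:

```python
def interval_attention_mask(segment_ids):
    """Build an n x n boolean mask where token i may attend to token j
    only if they share a segment, or if either belongs to segment 0
    (assumed here to be the question tokens).

    NOTE: illustrative sketch of an interval/block attention mask,
    not SKP's actual implementation; segment layout is an assumption.
    """
    n = len(segment_ids)
    mask = [[False] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            same_interval = segment_ids[i] == segment_ids[j]
            via_question = segment_ids[i] == 0 or segment_ids[j] == 0
            mask[i][j] = same_interval or via_question
    return mask

# Two question tokens (segment 0) followed by two subgraphs of two tokens each:
mask = interval_attention_mask([0, 0, 1, 1, 2, 2])
```

In a Transformer encoder this mask would be passed as the additive/boolean attention mask, so that during reasoning the representation of one subgraph is never mixed with tokens of another.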
Pages: 3854-3859
Page count: 6
相关论文
共 45 条
  • [1] Agarwal O, 2021, 2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), P3554
  • [2] Aitchison Laurence, 2021, ARXIV210702495
  • [3] DBpedia: A nucleus for a web of open data
    Auer, Soeren
    Bizer, Christian
    Kobilarov, Georgi
    Lehmann, Jens
    Cyganiak, Richard
    Ives, Zachary
    [J]. SEMANTIC WEB, PROCEEDINGS, 2007, 4825 : 722 - +
  • [4] Bollacker K., 2008, SIGMOD, P1247
  • [5] Bordes A., 2013, ADV NEURAL INFORM PR, V26, P2787, DOI DOI 10.5555/2999792.2999923
  • [6] Das R, 2022, PR MACH LEARN RES
  • [7] Das Rajarshi, 2021, ARXIV210408762
  • [8] de Jong Michiel, 2022, ARXIV221208153
  • [9] Devlin J, 2019, 2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, P4171
  • [10] Dong Guanting, 2023, ICASSP 2023 2023 IEE, P1, DOI [10.1109/ICASSP49357.2023.10095149, DOI 10.1109/ICASSP49357.2023.10095149]