Improving the Consistency of Semantic Parsing in KBQA Through Knowledge Distillation

Cited by: 1
Authors
Zou, Jun [1]
Cao, Shulin [2]
Wan, Jing [1]
Hou, Lei [2]
Xu, Jianjun [3]
Affiliations
[1] Beijing University of Chemical Technology, Beijing 100029, China
[2] Tsinghua University, Beijing 100084, China
[3] Beijing Caizhi Technology Co., Ltd., Beijing 100081, China
Source
Web and Big Data, Part III, APWeb-WAIM 2023 | 2024 / Vol. 14333
Funding
National Key R&D Program of China
Keywords
knowledge distillation; semantic parsing; consistency; KoPL; KBQA;
DOI
10.1007/978-981-97-2387-4_25
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Knowledge base question answering (KBQA) is an important task that involves analyzing natural language questions and retrieving relevant answers from a knowledge base. To achieve this, semantic parsing (SP) is used to parse a question into a structured logical form, which is then executed to obtain the answer. Although different logical forms have unique advantages, existing methods focus on a single logical form and do not consider the semantic consistency between different logical forms. In this paper, we address the previously unexplored issue of consistency in semantic parsing and show that improving the semantic consistency between multiple logical forms can increase parsing performance. To address the consistency problem, we present a dynamic knowledge distillation framework for semantic parsing (DKD-SP). Our framework enables one logical form to learn useful hidden knowledge from another, which improves the semantic consistency of different logical forms, and it dynamically adjusts the supervision weight of the hidden knowledge as the student model's ability changes. We evaluate our approach on the KQA Pro dataset, and the experimental results confirm its effectiveness. Our method improves the overall accuracy across the seven question types by 0.57%, with notable gains on Qualifier, Compare, and Count questions. Furthermore, in the compositional generalization scenario, the overall accuracy improves by 4.02%. Our code is publicly available at https://github.com/zjtfo/SP_Consistency_By_KD.
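The dynamic weighting idea described in the abstract, where the influence of the distilled hidden knowledge is adjusted as the student model improves, can be illustrated with a distillation loss whose soft-target weight depends on the student's own supervised loss. The snippet below is a minimal, hypothetical PyTorch sketch, not the authors' DKD-SP implementation; the function name, the sigmoid-based weight, and the temperature value are illustrative assumptions.

```python
import torch
import torch.nn.functional as F


def dynamic_distillation_loss(student_logits, teacher_logits, labels, temperature=2.0):
    """Sketch of a dynamically weighted distillation loss.

    The hard-label term trains the student on the gold logical-form tokens,
    while the soft-target (KL) term transfers hidden knowledge from the
    teacher (e.g., a parser for the other logical form). The distillation
    weight shrinks as the student's supervised loss decreases, so a capable
    student relies less on the teacher.
    """
    # Supervised loss on the gold logical form (hard labels).
    ce_loss = F.cross_entropy(student_logits, labels)

    # Soft-target loss: KL divergence between temperature-scaled distributions.
    kd_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    # Dynamic weight: close to 1 while the student is weak (high CE loss),
    # smaller once the student fits the hard labels well.
    alpha = torch.sigmoid(ce_loss.detach())

    return (1 - alpha) * ce_loss + alpha * kd_loss
```

In this sketch the weight is recomputed every batch from the detached supervised loss; any schedule that decays the distillation term as student accuracy rises would serve the same illustrative purpose.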
Pages: 373-388 (16 pages)