Generating knowledge aware explanation for natural language inference

Cited by: 8
Authors
Yang, Zongbao
Xu, Yinxin
Hu, Jinlong
Dong, Shoubin [1 ]
Affiliation
[1] South China Univ Technol, 381 Wushan Rd, Guangzhou 510641, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Natural language inference; Knowledge graph; Explainability of model; Natural language explanation;
DOI
10.1016/j.ipm.2022.103245
CLC classification
TP [Automation technology, computer technology];
Discipline code
0812;
Abstract
Natural language inference (NLI) is an increasingly important task in natural language processing, and explainable NLI generates natural language explanations (NLEs) in addition to label predictions, making NLI explainable and acceptable. However, NLEs generated by current models often violate commonsense or lack informativeness. In this paper, we propose a knowledge-enhanced explainable NLI framework (KxNLI) that leverages a Knowledge Graph (KG) to address these problems. Subgraphs of the KG are constructed from the concept set of the input sequence. The contextual embedding of the input and the graph embedding of the subgraphs are used to guide NLE generation via a copy mechanism. Furthermore, the generated NLEs are used to augment the original data. Experimental results show that KxNLI achieves state-of-the-art (SOTA) results on the SNLI dataset when the pretrained model is fine-tuned on the augmented data. Moreover, the proposed mechanisms of knowledge enhancement and rationale utilization perform well even on a vanilla seq2seq model, and transfer better when applied to the MultiNLI dataset. To comprehensively evaluate the generated NLEs, we design two metrics that measure NLE quality from the perspectives of accuracy and informativeness, respectively. The results show that KxNLI provides high-quality NLEs while making accurate predictions.
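The abstract's first step, constructing KG subgraphs from the concept set of the input pair, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the toy triple list, the token-matching concept extractor, and the one-hop expansion are all assumptions standing in for a real ConceptNet-style KG and grounding procedure.

```python
# Toy ConceptNet-style KG as (head, relation, tail) triples -- illustrative only.
KG = [
    ("dog", "IsA", "animal"),
    ("animal", "CapableOf", "run"),
    ("cat", "IsA", "animal"),
    ("run", "RelatedTo", "exercise"),
]

def concept_set(premise, hypothesis):
    """Concept set = input tokens that appear as KG nodes."""
    nodes = {h for h, _, _ in KG} | {t for _, _, t in KG}
    tokens = set(premise.lower().split()) | set(hypothesis.lower().split())
    return tokens & nodes

def build_subgraph(concepts, hops=1):
    """Induced subgraph: keep triples touching the concept set,
    expanding the node frontier `hops` times."""
    nodes = set(concepts)
    edges = []
    for _ in range(hops):
        new_edges = [(h, r, t) for h, r, t in KG
                     if (h in nodes or t in nodes) and (h, r, t) not in edges]
        edges += new_edges
        for h, _, t in new_edges:
            nodes.update([h, t])
    return nodes, edges

concepts = concept_set("a dog is running", "an animal moves")
nodes, edges = build_subgraph(concepts)
```

In the full framework, the resulting subgraph would be encoded (e.g. by a graph neural network) and combined with the contextual embedding of the input to guide the copy-mechanism decoder.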
Pages: 15