KGAnet: a knowledge graph attention network for enhancing natural language inference

Cited by: 12
Authors
Song, Meina [1 ]
Zhao, Wen [1 ]
HaiHong, E. [1 ]
Affiliations
[1] Beijing University of Posts and Telecommunications, Beijing, People's Republic of China
Funding
National Natural Science Foundation of China
Keywords
Natural language processing; Natural language inference; External knowledge
DOI
10.1007/s00521-020-04851-5
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Natural language inference (NLI) is a basic task underlying many applications such as question answering and paraphrase recognition. Existing methods have addressed the key issue of how an NLI model can benefit from external knowledge. Inspired by this, we further explore two problems: (1) how to make better use of external knowledge when the total amount of such knowledge is fixed, and (2) how to bring external knowledge to the NLI model more conveniently in application scenarios. In this paper, we propose a novel joint training framework that consists of a modified graph attention network, called the knowledge graph attention network (KGAnet), and an NLI model. We demonstrate that the proposed method outperforms existing methods that introduce external knowledge, and that it improves the performance of multiple NLI models without additional external knowledge.
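The abstract describes KGAnet as a modified graph attention network trained jointly with an NLI model. As a rough illustration of the graph-attention building block only (a minimal sketch in the style of standard GAT layers; the class name, dimensions, and toy graph below are illustrative assumptions, not the paper's actual KGAnet implementation), a self-contained PyTorch example:

```python
# Minimal sketch of a single graph-attention layer (GAT-style).
# All names and sizes here are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionLayer(nn.Module):
    """One attention head over a graph given by an adjacency matrix."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)  # node feature projection
        self.a = nn.Linear(2 * out_dim, 1, bias=False)   # attention scoring

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (N, in_dim) node features; adj: (N, N) 0/1 adjacency matrix
        z = self.W(h)                                    # (N, out_dim)
        n = z.size(0)
        # Pairwise concatenation [z_i || z_j] for every node pair (i, j).
        zi = z.unsqueeze(1).expand(n, n, -1)
        zj = z.unsqueeze(0).expand(n, n, -1)
        e = F.leaky_relu(self.a(torch.cat([zi, zj], dim=-1))).squeeze(-1)  # (N, N)
        # Mask non-edges before the softmax so attention stays on neighbors.
        e = e.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(e, dim=-1)                 # attention coefficients
        return F.elu(alpha @ z)                          # aggregated node features


# Tiny usage example on a 3-node toy graph (self-loops included).
if __name__ == "__main__":
    h = torch.randn(3, 8)
    adj = torch.tensor([[1, 1, 0], [1, 1, 1], [0, 1, 1]], dtype=torch.float)
    layer = GraphAttentionLayer(8, 16)
    print(layer(h, adj).shape)  # torch.Size([3, 16])
```

In the joint-training setting sketched by the abstract, the node representations produced by such a layer over a knowledge graph would be fed into (and trained together with) the downstream NLI model; the details of that coupling are specific to the paper and are not reproduced here.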
Pages: 14963-14973
Number of pages: 11