Improving Entity Linking by Modeling Latent Entity Type Information

Cited by: 0
Authors
Chen, Shuang [1 ]
Wang, Jinpeng [2 ]
Jiang, Feng [1 ,3 ]
Lin, Chin-Yew [2 ]
Affiliations
[1] Harbin Inst Technol, Harbin, Peoples R China
[2] Microsoft Res Asia, Beijing, Peoples R China
[3] Peng Cheng Lab, Shenzhen, Peoples R China
Source
THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE | 2020 / Vol. 34
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Existing state-of-the-art neural entity linking models employ an attention-based bag-of-words context model and pre-trained entity embeddings bootstrapped from word embeddings to assess topic-level context compatibility. However, the latent entity type information in the immediate context of the mention is neglected, which often causes these models to link mentions to incorrect entities of the wrong type. To tackle this problem, we propose to inject latent entity type information into the entity embeddings based on pre-trained BERT. In addition, we integrate a BERT-based entity similarity score into the local context model of a state-of-the-art model to better capture latent entity type information. Our model significantly outperforms state-of-the-art entity linking models on the standard benchmark (AIDA-CoNLL). Detailed experimental analysis demonstrates that our model corrects most of the type errors produced by the direct baseline.
Pages: 7529-7537
Number of pages: 9
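
As a rough, illustrative sketch of the approach described in the abstract (not the authors' released implementation), the snippet below shows how a BERT-based similarity score between a mention's immediate context and candidate entity embeddings might be combined with an existing local context score for candidate re-ranking. The model name ("bert-base-uncased"), the helper names (encode_mention_context, bert_entity_similarity), and the placeholder candidate embeddings and local scores are all assumptions made for illustration.

import torch
from transformers import BertModel, BertTokenizer

# Hypothetical setup: a pre-trained BERT encoder used to capture latent type
# information carried by the mention's immediate context.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")
bert.eval()

def encode_mention_context(sentence: str) -> torch.Tensor:
    # Encode the mention's surrounding sentence; use the [CLS] vector as a
    # compact context representation (one plausible choice, not prescribed
    # by the paper's record above).
    inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        outputs = bert(**inputs)
    return outputs.last_hidden_state[:, 0, :].squeeze(0)  # shape: (hidden_size,)

def bert_entity_similarity(context_vec: torch.Tensor,
                           candidate_embeddings: torch.Tensor) -> torch.Tensor:
    # Dot-product similarity between the context vector and each candidate's
    # BERT-based entity embedding; returns shape (num_candidates,).
    return candidate_embeddings @ context_vec

# Hypothetical usage: re-rank candidate entities by adding the BERT similarity
# to a pre-existing local context score (e.g., from an attention-based model).
context_vec = encode_mention_context("He signed for Chelsea ahead of the new season.")
candidate_embeddings = torch.randn(3, bert.config.hidden_size)  # placeholder entity embeddings
local_scores = torch.tensor([0.2, 0.5, 0.1])                    # placeholder local context scores
final_scores = local_scores + bert_entity_similarity(context_vec, candidate_embeddings)
best_candidate = int(torch.argmax(final_scores))

In the paper the candidate entity embeddings are themselves derived from pre-trained BERT so that they encode latent type information; the random tensors above merely stand in for such embeddings.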