Zero-Shot Entity Linking by Reading Entity Descriptions

Cited by: 0
Authors
Logeswaran, Lajanugen [1 ,3 ]
Chang, Ming-Wei [2 ]
Lee, Kenton [2 ]
Toutanova, Kristina [2 ]
Devlin, Jacob [2 ]
Lee, Honglak [1 ,2 ]
Affiliations
[1] Univ Michigan, Ann Arbor, MI 48109 USA
[2] Google Res, Mountain View, CA USA
[3] Google, Mountain View, CA USA
Source
57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019) | 2019
Keywords
DOI
(not available)
CLC classification
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
We present the zero-shot entity linking task, where mentions must be linked to unseen entities without in-domain labeled data. The goal is to enable robust transfer to highly specialized domains, and so no metadata or alias tables are assumed. In this setting, entities are only identified by text descriptions, and models must rely strictly on language understanding to resolve the new entities. First, we show that strong reading comprehension models pre-trained on large unlabeled data can be used to generalize to unseen entities. Second, we propose a simple and effective adaptive pre-training strategy, which we term domain-adaptive pre-training (DAP), to address the domain shift problem associated with linking unseen entities in a new domain. We present experiments on a new dataset that we construct for this task and show that DAP improves over strong pre-training baselines, including BERT. The data and code are available at https://github.com/lajanugen/zeshel.
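The core setup described in the abstract — resolving a mention by comparing its context against candidate entities' text descriptions — can be illustrated with a minimal sketch. This is not the paper's method (which uses a pre-trained Transformer reader); it substitutes a toy bag-of-words cosine similarity purely to show the interface, and all names (`link_mention`, `cosine_bow`, the example entities) are hypothetical:

```python
from collections import Counter
import math

def cosine_bow(a, b):
    """Cosine similarity between bag-of-words token counts (toy scorer)."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[t] * cb[t] for t in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def link_mention(mention_context, candidate_descriptions):
    """Return the candidate entity whose description best matches the
    mention context. A learned reading-comprehension model would replace
    cosine_bow in the actual zero-shot setting."""
    return max(candidate_descriptions,
               key=lambda e: cosine_bow(mention_context, candidate_descriptions[e]))

# Unseen entities, identified only by their descriptions (no alias tables).
entities = {
    "Mercury (planet)": "Mercury is the smallest planet in the solar system.",
    "Mercury (element)": "Mercury is a chemical element, a heavy silvery metal.",
}
print(link_mention("the probe orbited the planet closest to the sun", entities))
```

Because no alias table or entity metadata is assumed, the only signal available at inference time is the text of the description itself, which is why the paper argues for strong reading models and domain-adaptive pre-training.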
Pages: 3449-3460
Page count: 12