Temporal Effects on Pre-trained Models for Language Processing Tasks

Cited by: 25
Authors
Agarwal, Oshin [1]
Nenkova, Ani [2]
Affiliations
[1] Univ Penn, Philadelphia, PA 19104 USA
[2] Adobe Res, New York, NY USA
Keywords
DOI
10.1162/tacl_a_00497
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Keeping the performance of language technologies optimal as time passes is of great practical interest. We study temporal effects on model performance on downstream language tasks, establishing a nuanced terminology for such discussion and identifying factors essential to conduct a robust study. We present experiments for several tasks in English where the label correctness is not dependent on time and demonstrate the importance of distinguishing between temporal model deterioration and temporal domain adaptation for systems using pre-trained representations. We find that, depending on the task, temporal model deterioration is not necessarily a concern. Temporal domain adaptation, however, is beneficial in all cases, with better performance for a given time period possible when the system is trained on temporally more recent data. Therefore, we also examine the efficacy of two approaches for temporal domain adaptation without human annotations on new data. Self-labeling shows consistent improvement and notably, for named entity recognition, leads to better temporal adaptation than even human annotations.
Pages: 904-921
Page count: 18