A Simple but Effective Method to Incorporate Multi-turn Context with BERT for Conversational Machine Comprehension

Cited by: 0
Authors
Ohsugi, Yasuhito [1 ]
Saito, Itsumi [1 ]
Nishida, Kyosuke [1 ]
Asano, Hisako [1 ]
Tomita, Junji [1 ]
Affiliations
[1] NTT Corp, NTT Media Intelligence Labs, Tokyo, Japan
Source
NLP FOR CONVERSATIONAL AI | 2019
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405
Abstract
Conversational machine comprehension (CMC) requires understanding the context of a multi-turn dialogue. Using BERT, a pre-trained language model, has been successful for single-turn machine comprehension, whereas modeling multiple turns of question answering with BERT has not been established because BERT limits the number and length of its input sequences. In this paper, we propose a simple but effective method that uses BERT for CMC. Our method uses BERT to encode a paragraph independently, conditioned on each question and each answer in the multi-turn context, and then predicts an answer on the basis of these paragraph representations. Experiments with the representative CMC datasets QuAC and CoQA show that our method outperforms recently published methods (+0.8 F1 on QuAC and +2.1 F1 on CoQA). In addition, a detailed analysis of how the number and types of dialogue history turns affect CMC accuracy shows that the gold answer history, which may not be available in an actual conversation, contributes most to model performance on both datasets.
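
The encoding scheme described in the abstract, encoding the paragraph once per history question or answer and then combining the resulting paragraph representations before answer prediction, can be illustrated with a rough sketch. The snippet below is an assumption-laden illustration rather than the authors' implementation: it assumes the Hugging Face transformers library, combines the per-turn paragraph representations by simple averaging, and attaches a toy linear span scorer; the example texts, the averaging step, and the span_scorer layer are all illustrative choices not taken from the paper.

import torch
from transformers import BertModel, BertTokenizer

# Illustrative sketch only: the mean pooling and the linear span scorer are
# assumptions, not the combination described in the paper's full text.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")
bert.eval()

paragraph = "The quick brown fox jumps over the lazy dog."
history = [
    "What animal jumps?",       # earlier question
    "A quick brown fox.",       # earlier (gold) answer
    "What does it jump over?",  # current question
]

# Encode the paragraph independently, conditioned on each utterance:
# [CLS] utterance [SEP] paragraph [SEP]
per_turn_states = []
with torch.no_grad():
    for utterance in history:
        enc = tokenizer(utterance, paragraph, return_tensors="pt", truncation=True)
        out = bert(**enc)
        # Keep only the hidden states of the second segment (the paragraph).
        segment_mask = enc["token_type_ids"].bool()
        per_turn_states.append(out.last_hidden_state[segment_mask])

# Combine the per-turn paragraph representations (simple mean pooling here).
combined = torch.stack(per_turn_states).mean(dim=0)  # (num_paragraph_tokens, hidden)

# Toy span predictor: start/end logits over paragraph token positions.
span_scorer = torch.nn.Linear(bert.config.hidden_size, 2)
start_logits, end_logits = span_scorer(combined).unbind(dim=-1)
print(start_logits.shape, end_logits.shape)

In a real system the encoder and the span scorer would be trained jointly on the CMC data; this snippet only demonstrates the per-turn encoding and pooling shapes.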
Pages: 11-17
Number of pages: 7