Multihop Attention Networks for Question Answer Matching

Cited by: 37
Authors
Tran, Nam Khanh [1]
Niederee, Claudia [1]
Affiliations
[1] Leibniz Univ Hannover, Res Ctr L3S, Hannover, Germany
Source
ACM/SIGIR PROCEEDINGS 2018 | 2018
Keywords
Answer selection; non-factoid QA; representation learning; attention mechanism
DOI
10.1145/3209978.3210009
Chinese Library Classification (CLC)
TP [automation technology; computer technology]
Subject Classification Code
0812
Abstract
Attention-based neural network models have been successfully applied to answer selection, an important subtask of question answering (QA). These models often represent a question by a single vector and find its corresponding matches by attending to candidate answers. However, questions and answers may be related to each other in complicated ways that single-vector representations cannot capture. In this paper, we propose Multihop Attention Networks (MAN), which aim to uncover these complex relations for ranking question-answer pairs. Unlike previous models, we do not collapse the question into a single vector; instead, we use multiple vectors that focus on different parts of the question for its overall semantic representation, and we apply multiple steps of attention to learn representations for the candidate answers. For each attention step, in addition to common attention mechanisms, we adopt sequential attention, which utilizes context information to compute context-aware attention weights. Via extensive experiments, we show that MAN outperforms state-of-the-art approaches on popular benchmark QA datasets. Empirical studies confirm the effectiveness of sequential attention over other attention mechanisms.
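The multihop idea in the abstract can be made concrete with a small sketch: attend to the question several times, letting each hop's attention weights depend on what earlier hops read (the "sequential" conditioning), and use each per-hop question summary to attend over the answer. The following is a minimal, self-contained numpy illustration, not the paper's model: the projection matrices Wq and Wa, the additive memory update, and the per-hop cosine scoring are assumptions introduced here for illustration; the paper itself uses BiLSTM encoders and its own sequential-attention formulation.

```python
# Illustrative multihop attention matcher (assumed formulation, not the
# paper's exact equations). Token encodings are random stand-ins for
# contextual states, e.g. from a shared BiLSTM.
import numpy as np

rng = np.random.default_rng(0)
d, hops = 8, 3                      # hidden size, number of attention hops
Lq, La = 5, 12                      # question / answer lengths

Q = rng.standard_normal((Lq, d))    # question hidden states
A = rng.standard_normal((La, d))    # answer hidden states

# Hop-specific projections (hypothetical parameters).
Wq = rng.standard_normal((hops, d, d)) * 0.1
Wa = rng.standard_normal((hops, d, d)) * 0.1

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8))

m = np.zeros(d)                     # memory carried across hops
score = 0.0
for k in range(hops):
    # Sequential attention over the question: weights are conditioned on
    # the memory of earlier hops (zero memory at hop 0 gives a uniform
    # first read), so each hop can focus on a different part.
    alpha = softmax(Q @ (Wq[k] @ m))
    q_k = alpha @ Q                 # hop-k question summary vector
    m = m + q_k                     # accumulate what has been read so far
    # Attend over the answer with this hop's question view.
    beta = softmax(A @ (Wa[k] @ q_k))
    a_k = beta @ A
    score += cosine(q_k, a_k)       # aggregate per-hop similarities

print(f"match score over {hops} hops: {score:.4f}")
```

Ranking candidate answers then amounts to computing this score for each candidate and sorting; the multiple hops are what let the model match different aspects of the question against different spans of the answer.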
Pages: 325-334 (10 pages)