HACAN: a hierarchical answer-aware and context-aware network for question generation

Times Cited: 2
Authors
Sun, Ruijun [1 ]
Tao, Hanqin [1 ]
Chen, Yanmin [1 ]
Liu, Qi [1 ]
Affiliations
[1] Univ Sci & Technol China, Anhui Prov Key Lab Big Data Anal & Applicat, Hefei 230027, Peoples R China
Funding
National Key R&D Program of China;
Keywords
question generation; natural language generation; natural language processing; sequence to sequence;
DOI
10.1007/s11704-023-2246-2
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Question Generation (QG) is the task of generating questions from a given context. Most existing methods rely on Recurrent Neural Networks (RNNs) and take passage-level input to provide more detail, but they suffer from vanishing gradients and ineffective use of the available information. In practice, selectively extracting useful information from the given context better matches how questions are actually asked, especially in educational scenarios. To that end, in this paper we propose a novel Hierarchical Answer-Aware and Context-Aware Network (HACAN) that constructs a high-quality passage representation and balances sentence-level against passage-level information. Specifically, a Hierarchical Passage Encoder (HPE) builds an answer-aware and context-aware passage representation using a multi-hop reasoning strategy. Drawing inspiration from the human questioning process, we then design a Hierarchical Passage-aware Decoder (HPD) that determines when to draw on the passage information. Extensive experiments on the SQuAD dataset verify the effectiveness of our model against several baselines.
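To make the abstract's two components concrete, the PyTorch sketch below shows one common way such a design could look: a word-level BiLSTM per sentence feeding a sentence-level BiLSTM (a rough analogue of the HPE), answer-awareness via answer-position tag embeddings, and a sigmoid gate that mixes sentence- and passage-level context at each decoding step (a rough analogue of the HPD's decision of when to use the passage). Every name, dimension, and pooling choice here (HierarchicalPassageEncoder, PassageGate, answer_tags, mean pooling) is a hypothetical simplification for illustration, not the authors' HACAN implementation.

```python
import torch
import torch.nn as nn


class HierarchicalPassageEncoder(nn.Module):
    """Hypothetical stand-in for the paper's HPE: a word-level BiLSTM
    per sentence, then a sentence-level BiLSTM over the passage."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Answer-position tags (0 = outside the answer span, 1 = inside)
        # are one common way to make an encoder "answer-aware".
        self.answer_embed = nn.Embedding(2, embed_dim)
        self.word_lstm = nn.LSTM(embed_dim, hidden_dim,
                                 batch_first=True, bidirectional=True)
        self.sent_lstm = nn.LSTM(2 * hidden_dim, hidden_dim,
                                 batch_first=True, bidirectional=True)

    def forward(self, tokens, answer_tags):
        # tokens, answer_tags: (batch, n_sents, n_words)
        b, s, w = tokens.shape
        x = self.embed(tokens) + self.answer_embed(answer_tags)
        word_states, _ = self.word_lstm(x.view(b * s, w, -1))  # (b*s, w, 2h)
        sent_vecs = word_states.mean(dim=1).view(b, s, -1)     # crude pooling
        passage_states, _ = self.sent_lstm(sent_vecs)          # (b, s, 2h)
        return word_states.view(b, s, w, -1), passage_states


class PassageGate(nn.Module):
    """Hypothetical stand-in for the HPD's choice of when to use the
    passage: a sigmoid gate mixing sentence- and passage-level context
    vectors at each decoding step."""

    def __init__(self, dim):
        super().__init__()
        self.gate = nn.Linear(3 * dim, 1)

    def forward(self, dec_state, sent_ctx, passage_ctx):
        g = torch.sigmoid(self.gate(
            torch.cat([dec_state, sent_ctx, passage_ctx], dim=-1)))
        return g * passage_ctx + (1.0 - g) * sent_ctx


# Minimal usage: 2 passages, 5 sentences each, 20 words per sentence.
enc = HierarchicalPassageEncoder(vocab_size=10000)
tokens = torch.randint(0, 10000, (2, 5, 20))
tags = torch.zeros_like(tokens)  # no answer span marked in this toy input
word_states, passage_states = enc(tokens, tags)
```

The gate makes the sentence-versus-passage trade-off explicit and differentiable, which is one plausible reading of a decoder that "determines when to utilize the passage information"; the actual HPE/HPD internals (e.g., the multi-hop reasoning strategy) are described only in the full paper.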
Pages: 11