Multi-Dimension Attention for Multi-Turn Dialog Generation (Student Abstract)

Cited: 0
Authors
Belainine, Billal [1 ]
Sadat, Fatiha [1 ]
Boukadoum, Mounir [1 ]
Affiliations
[1] Univ Quebec, 201 President Kennedy, Montreal, PQ H2X 3Y7, Canada
Source
THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE | 2022
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper presents a generative neural model for open-domain, multi-turn dialog response generation that relies on a multi-dimension attention process to account for the semantic interdependence between the generated words and the conversational history, so as to identify all the words and utterances that influence each generated response. The performance of the model is evaluated on the wide-scope DailyDialog corpus and compared with two other generative neural architectures using machine-learning metrics. The results show that the proposed model improves the state of the art in generation accuracy, and that its multi-dimension attention allows more detailed tracking of the influential words and utterances in the dialog history, making each generated response explainable with respect to that history.
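The abstract does not describe the model's architecture in detail, but the idea of attending jointly to words and utterances can be illustrated with a small, hedged sketch. The PyTorch module below applies attention at two levels of an encoded dialog history (over words within each utterance, then over the resulting utterance summaries) and returns both sets of weights, which is the kind of signal the abstract points to for explainability. All names, tensor shapes, and the single-query simplification are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of two-level (word- and utterance-level) attention over a
# dialog history. Hypothetical class and tensor layout; not the authors' model.
import torch
import torch.nn.functional as F


class TwoLevelDialogAttention(torch.nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        # Separate query projections for word-level and utterance-level scoring.
        self.word_query = torch.nn.Linear(hidden_size, hidden_size)
        self.utt_query = torch.nn.Linear(hidden_size, hidden_size)

    def forward(self, history: torch.Tensor, decoder_state: torch.Tensor):
        """
        history:       (num_utterances, num_words, hidden_size) encoded dialog history
        decoder_state: (hidden_size,) current decoder hidden state
        Returns a context vector plus word- and utterance-level attention weights.
        """
        # Word-level attention inside each utterance of the history.
        q_w = self.word_query(decoder_state)                     # (hidden,)
        word_scores = torch.einsum("uwh,h->uw", history, q_w)    # (utt, words)
        word_weights = F.softmax(word_scores, dim=-1)
        utt_vectors = torch.einsum("uw,uwh->uh", word_weights, history)

        # Utterance-level attention over the summarized utterances.
        q_u = self.utt_query(decoder_state)                      # (hidden,)
        utt_scores = torch.einsum("uh,h->u", utt_vectors, q_u)   # (utt,)
        utt_weights = F.softmax(utt_scores, dim=-1)
        context = torch.einsum("u,uh->h", utt_weights, utt_vectors)

        return context, word_weights, utt_weights


# Usage: the returned weights expose which words and utterances influenced
# the context used to generate the next response token.
attn = TwoLevelDialogAttention(hidden_size=64)
history = torch.randn(3, 10, 64)      # 3 past utterances of 10 tokens each
decoder_state = torch.randn(64)
context, w_words, w_utts = attn(history, decoder_state)
```

Returning the word- and utterance-level weights alongside the context vector is what makes it possible to inspect, for each generated token, which parts of the conversational history carried the most influence.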
Pages: 12909 / 12910
Number of pages: 2