Abstractive Summarization Model with a Feature-Enhanced Seq2Seq Structure

Cited by: 0
Authors
Hao, Zepeng [1 ]
Ji, Jingzhou [1 ]
Xie, Tao [1 ]
Xue, Bin [1 ]
Affiliations
[1] Natl Univ Def Technol, Sch Informat & Commun, Xian, Peoples R China
Source
2020 5TH ASIA-PACIFIC CONFERENCE ON INTELLIGENT ROBOT SYSTEMS (ACIRS 2020) | 2020
Keywords
abstractive summarization; feature-enhanced Seq2Seq structure; memory network; non-local network;
DOI
10.1109/acirs49895.2020.9162627
Chinese Library Classification (CLC)
TP [automation technology; computer technology];
Discipline code
0812;
Abstract
Abstractive text summarization uses deep learning methods to condense one or more documents into a concise summary that expresses the main meaning of the source text. Most existing methods are based on the traditional Seq2Seq structure, which has limited ability to capture and store long-term and global features, so the generated summaries often lack information. In this paper, we propose a new abstractive summarization model based on a feature-enhanced Seq2Seq structure for the single-document summarization task. The model uses two types of feature-capture networks to improve the encoder and decoder of the traditional Seq2Seq structure, strengthening its ability to capture and store long-term and global features so that the generated summaries are more informative and more fluent. We evaluate the proposed model on the CNN/DailyMail dataset. Experimental results demonstrate that it outperforms the baseline model, improving the R-1, R-2, and R-L metrics by 5.6%, 5.3%, and 6.2%, respectively.
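The abstract and keywords indicate that the encoder/decoder are augmented with feature-capture networks (a memory network and a non-local network) to add global context to the standard recurrent Seq2Seq pipeline. The sketch below illustrates only the non-local idea: a self-attention block applied over encoder hidden states so every position can draw on all others. The GRU encoder, layer sizes, and the `NonLocalBlock` design are illustrative assumptions, not the authors' exact architecture.

```python
# Minimal sketch (assumed PyTorch implementation) of a non-local block over
# encoder states; hyperparameters and module structure are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NonLocalBlock(nn.Module):
    """Self-attention over a sequence of hidden states (non-local operation)."""

    def __init__(self, dim):
        super().__init__()
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)
        self.value = nn.Linear(dim, dim)

    def forward(self, h):                        # h: (batch, seq_len, dim)
        q, k, v = self.query(h), self.key(h), self.value(h)
        scores = q @ k.transpose(1, 2) / h.size(-1) ** 0.5
        attn = F.softmax(scores, dim=-1)         # each position attends to all positions
        return h + attn @ v                      # residual keeps the local RNN features


class FeatureEnhancedEncoder(nn.Module):
    """Bidirectional GRU encoder followed by a non-local block (illustrative)."""

    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.gru = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.non_local = NonLocalBlock(2 * hid_dim)

    def forward(self, tokens):                   # tokens: (batch, seq_len)
        h, _ = self.gru(self.embed(tokens))      # local, sequential features
        return self.non_local(h)                 # enriched with global (non-local) features


# Usage: encode a toy batch of token ids.
enc = FeatureEnhancedEncoder(vocab_size=10000)
states = enc(torch.randint(0, 10000, (2, 40)))   # -> shape (2, 40, 512)
```

The residual connection is the key design choice in this sketch: the non-local features supplement, rather than replace, the sequential features produced by the recurrent encoder.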
Pages: 163-167
Number of pages: 5