A Survey of Text Summarization Approaches Based on Deep Learning

Cited by: 10
Authors
Hou, Sheng-Luan [1 ,2 ]
Huang, Xi-Kun [2 ,3 ,4 ]
Fei, Chao-Qun [1 ,2 ]
Zhang, Shu-Han [1 ,2 ]
Li, Yang-Yang [3 ,4 ]
Sun, Qi-Lin [2 ,3 ,4 ]
Wang, Chuan-Qing [2 ,3 ,4 ]
Affiliations
[1] Chinese Acad Sci, Inst Comp Technol, Beijing 100190, Peoples R China
[2] Univ Chinese Acad Sci, Beijing 100049, Peoples R China
[3] Chinese Acad Sci, Acad Math & Syst Sci, Beijing 100190, Peoples R China
[4] Chinese Acad Sci, Key Lab Management Decis & Informat Syst, Beijing 100190, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
automatic text summarization; artificial intelligence; deep learning; attentional encoder-decoder; natural language processing;
DOI
10.1007/s11390-020-0207-x
CLC Number
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812;
Abstract
Automatic text summarization (ATS) has achieved impressive performance thanks to recent advances in deep learning (DL) and the availability of large-scale corpora. The key challenges in ATS are estimating the salience of information and generating coherent output, and a variety of DL-based approaches have recently been developed to better address these two aspects. However, a comprehensive literature review of DL-based ATS approaches is still lacking. This paper comprehensively reviews the significant DL-based approaches proposed for generic ATS tasks and walks through their evolution. We first give an overview of ATS and DL, and compare the datasets commonly used for model training, validation, and evaluation. We then summarize single-document summarization approaches, followed by an overview of multi-document summarization approaches, and analyze the performance of popular ATS models on common datasets, noting which approaches suit different ATS tasks. Finally, we propose potential research directions in this fast-growing field. We hope this exploration provides new insights for future research on DL-based ATS.
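The "attentional encoder-decoder" named in the keywords pairs a recurrent encoder over the source document with a decoder that attends to all encoder states at each generation step. The following is a minimal illustrative sketch of that pattern in PyTorch, using Bahdanau-style additive attention; the class name, dimensions, and toy inputs are assumptions chosen for illustration, not an implementation from this paper or any specific surveyed model.

# Minimal sketch of an attentional encoder-decoder for abstractive
# summarization (Bahdanau-style additive attention). Illustrative only:
# names, dimensions, and the toy usage below are assumptions.
import torch
import torch.nn as nn

class AttentionalSeq2Seq(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True,
                              bidirectional=True)
        self.decoder = nn.GRUCell(emb_dim + 2 * hid_dim, hid_dim)
        # Additive (Bahdanau) attention parameters.
        self.W_enc = nn.Linear(2 * hid_dim, hid_dim, bias=False)
        self.W_dec = nn.Linear(hid_dim, hid_dim, bias=False)
        self.v = nn.Linear(hid_dim, 1, bias=False)
        self.out = nn.Linear(hid_dim + 2 * hid_dim, vocab_size)

    def forward(self, src, tgt):
        # src: (batch, src_len) and tgt: (batch, tgt_len) token ids.
        enc_states, _ = self.encoder(self.embed(src))        # (B, S, 2H)
        dec_h = enc_states.new_zeros(src.size(0), self.decoder.hidden_size)
        logits = []
        for t in range(tgt.size(1)):
            # Score every encoder state against the current decoder state.
            scores = self.v(torch.tanh(
                self.W_enc(enc_states) + self.W_dec(dec_h).unsqueeze(1)))
            weights = torch.softmax(scores, dim=1)           # (B, S, 1)
            context = (weights * enc_states).sum(dim=1)      # (B, 2H)
            # Feed the ground-truth token plus context (teacher forcing).
            dec_h = self.decoder(
                torch.cat([self.embed(tgt[:, t]), context], dim=-1), dec_h)
            logits.append(self.out(torch.cat([dec_h, context], dim=-1)))
        return torch.stack(logits, dim=1)                    # (B, T, vocab)

# Toy usage: two random "documents" and their "summaries".
model = AttentionalSeq2Seq()
src = torch.randint(0, 10000, (2, 40))
tgt = torch.randint(0, 10000, (2, 12))
print(model(src, tgt).shape)  # torch.Size([2, 12, 10000])

The per-step attention weights give the salience estimate the abstract refers to: at each decoding step the model learns which source positions matter for the next summary token, rather than compressing the whole document into a single fixed vector.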
Pages: 633-663
Page count: 31