Natural Language Generation Using Sequential Models: A Survey

Cited by: 12
Authors
Pandey, Abhishek Kumar [1 ]
Roy, Sanjiban Sekhar [1 ]
Affiliation
[1] Vellore Inst Technol, Sch Comp Sci & Engn, Vellore 632014, Vellore, India
Keywords
Natural language processing; Long short-term memory; Natural language generation; Recurrent neural network; Sequential generative model; Story generation; Text generation;
DOI
10.1007/s11063-023-11281-6
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Natural Language Generation (NLG) is one of the most critical yet challenging tasks among Natural Language Processing applications. It automates the production of text whose meaning humans can understand. A number of research articles in the literature have described how NLG can produce understandable text in various languages. Sequence-to-sequence modeling powered by deep learning techniques such as Long Short-Term Memory networks, Recurrent Neural Networks, and Gated Recurrent Units has gained considerable popularity for text generation. This survey provides a comprehensive overview of text generation and its related techniques: statistical, traditional, and neural-network-based. Generating text with a sequence-to-sequence model is not a simple task, as the model must handle both continuous data, such as images, and discrete information, such as text. Therefore, in this study, we identify several crucial areas for further research on text generation, such as incorporating large text datasets, identifying and resolving grammatical errors, and generating long sentences or paragraphs. This work also presents a detailed overview of the activation functions used in deep-learning-based models and of the evaluation metrics used for text generation.
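The recurrent generators the abstract names (LSTM, GRU) share a gated recurrence at their core. As a rough illustration only, a single scalar LSTM time step can be sketched as below; the weight values are arbitrary placeholders, not trained parameters, and a real model would use vector states and a learned softmax output layer:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    # One LSTM time step on scalar inputs/states, using the standard
    # gate equations: forget (f), input (i), candidate (g), output (o).
    f = sigmoid(W["f_x"] * x + W["f_h"] * h_prev + W["f_b"])
    i = sigmoid(W["i_x"] * x + W["i_h"] * h_prev + W["i_b"])
    g = math.tanh(W["g_x"] * x + W["g_h"] * h_prev + W["g_b"])
    o = sigmoid(W["o_x"] * x + W["o_h"] * h_prev + W["o_b"])
    c = f * c_prev + i * g   # new cell state: gated mix of memory and candidate
    h = o * math.tanh(c)     # new hidden state, bounded in (-1, 1)
    return h, c

# Illustrative fixed weights (placeholders, not trained values).
W = {k: 0.5 for k in ["f_x", "f_h", "f_b", "i_x", "i_h", "i_b",
                      "g_x", "g_h", "g_b", "o_x", "o_h", "o_b"]}

h, c = 0.0, 0.0
for x in [0.1, 0.4, -0.2]:   # a toy input "sequence"
    h, c = lstm_step(x, h, c, W)
print(h, c)
```

In a seq2seq text generator, an encoder of this form would consume the input sequence and a decoder would emit one token per step, conditioned on the final (h, c) state.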
Pages: 7709-7742 (34 pages)
References (73 total)
[1] Abujar S., 2019, INT CONF COMPUT, DOI 10.1109/ICCCNT45670.2019.8944784
[2] [Anonymous], 2016, INT J COMPUT SYST EN, DOI 10.1504/IJCSYSE.2016.079000
[3] [Anonymous], 2010, Proceedings of the 2010 Conference on Empirical Methods in Natural Language Processing, EMNLP '10
[4] Ayana; Chen, Yun; Yang, Cheng; Liu, Zhiyuan; Sun, Maosong. Reinforced Zero-Shot Cross-Lingual Neural Headline Generation [J]. IEEE-ACM Transactions on Audio, Speech, and Language Processing, 2020, 28: 2572-2584
[5] Ayana; Shen, Shi-qi; Chen, Yun; Yang, Cheng; Liu, Zhi-yuan; Sun, Mao-song. Zero-Shot Cross-Lingual Neural Headline Generation [J]. IEEE-ACM Transactions on Audio, Speech, and Language Processing, 2018, 26(12): 2319-2327
[6] Bao, Junwei; Tang, Duyu; Duan, Nan; Yan, Zhao; Zhou, Ming; Zhao, Tiejun. Text Generation From Tables [J]. IEEE-ACM Transactions on Audio, Speech, and Language Processing, 2019, 27(2): 311-320
[7] Biswas, Raj; Vasan, Ashwin; Roy, Sanjiban Sekhar. Dilated Deep Neural Network for Segmentation of Retinal Blood Vessels in Fundus Images [J]. Iranian Journal of Science and Technology - Transactions of Electrical Engineering, 2020, 44(1): 505-518
[8] Bouchard G., 2007, NIPS
[9] Bourane, Steeve; Duan, Bo; Koch, Stephanie C.; Dalet, Antoine; Britz, Olivier; Garcia-Campmany, Lidia; Kim, Euiseok; Cheng, Longzhen; Ghosh, Anirvan; Ma, Qiufu; Goulding, Martyn. Gate control of mechanical itch by a subpopulation of spinal cord interneurons [J]. Science, 2015, 350(6260): 550-554
[10] Browne Anthony, 1981, Hansel and Gretel