Natural Language Generation Using Sequential Models: A Survey

Cited: 0
|
Authors
Abhishek Kumar Pandey
Sanjiban Sekhar Roy
Affiliations
[1] Vellore Institute of Technology, School of Computer Science and Engineering, Vellore
Source
Neural Processing Letters | 2023 / Vol. 55
Keywords
Natural language processing; Long short-term memory; Natural language generation; Recurrent neural network; Sequential generative model; Story generation
Abstract
Natural Language Generation (NLG) is one of the most critical yet challenging tasks among Natural Language Processing applications: the automatic production of text that humans can read and understand. A number of research articles in the literature have described how NLG can produce understandable text in various languages. Sequence-to-sequence modeling powered by deep learning techniques such as Long Short-Term Memory (LSTM) networks, Recurrent Neural Networks (RNNs), and Gated Recurrent Units (GRUs) has gained considerable popularity for text generation. This survey provides a comprehensive overview of text generation and its related techniques, including statistical, traditional, and neural-network-based approaches. Generating text with a sequence-to-sequence model is not a simple task, as the model may need to handle both continuous data, such as images, and discrete data, such as text. We therefore identify several crucial areas for further research on text generation, such as incorporating large text datasets, identifying and resolving grammatical errors, and generating long sentences or paragraphs. This work also presents a detailed overview of the activation functions used in deep-learning-based models and of the evaluation metrics used for text generation.
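To make the sequential models discussed above concrete, the forward pass of a single LSTM cell can be sketched in NumPy. This is an illustrative minimal example, not code from the surveyed paper; all variable names, sizes, and the random initialization are assumptions chosen for the demonstration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM cell step over a single token embedding x.

    W, U, b hold the stacked parameters of the four gates
    (input, forget, output, candidate), each of hidden size H.
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b      # stacked pre-activations for all four gates
    i = sigmoid(z[0:H])             # input gate: how much new information to write
    f = sigmoid(z[H:2 * H])         # forget gate: how much old cell state to keep
    o = sigmoid(z[2 * H:3 * H])     # output gate: how much cell state to expose
    g = np.tanh(z[3 * H:4 * H])     # candidate cell update
    c = f * c_prev + i * g          # new cell state
    h = o * np.tanh(c)              # new hidden state (fed to the next step)
    return h, c

# Tiny example: embedding size 3, hidden size 4, a 5-token sequence.
rng = np.random.default_rng(0)
D, H = 3, 4
W = rng.normal(size=(4 * H, D)) * 0.1
U = rng.normal(size=(4 * H, H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(5, D)):   # unroll the cell over the sequence
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)
```

In a text generator, the final hidden state `h` would be projected to vocabulary logits at each step; because `h = o * tanh(c)` with `o` in (0, 1), every component of `h` stays within (-1, 1).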
Pages: 7709-7742
Page count: 33
Related Papers
50 results
  • [31] Survey on Mathematical Word Problem Solving Using Natural Language Processing
    Ughade, Shounaak
    Kumbhar, Satish
    PROCEEDINGS OF 2019 1ST INTERNATIONAL CONFERENCE ON INNOVATIONS IN INFORMATION AND COMMUNICATION TECHNOLOGY (ICIICT 2019), 2019,
  • [32] Strengthening INORMALS Using Context-based Natural Language Generation
    Yora, Soni
    Barmawi, Ari Moesriami
    JOURNAL OF ICT RESEARCH AND APPLICATIONS, 2022, 16 (02) : 101 - 122
  • [33] Natural Language Generation Using Deep Learning to Support MOOC Learners
    Li, Chenglu
    Xing, Wanli
    INTERNATIONAL JOURNAL OF ARTIFICIAL INTELLIGENCE IN EDUCATION, 2021, 31 : 186 - 214
  • [34] Towards the generation of hierarchical attack models from cybersecurity vulnerabilities using language models
    Sowka, Kacper
    Palade, Vasile
    Jiang, Xiaorui
    Jadidbonab, Hesam
    APPLIED SOFT COMPUTING, 2025, 171
  • [35] Natural Language Generation Using Deep Learning to Support MOOC Learners
    Li, Chenglu
    Xing, Wanli
    INTERNATIONAL JOURNAL OF ARTIFICIAL INTELLIGENCE IN EDUCATION, 2021, 31 (02) : 186 - 214
  • [36] Natural language processing in finance: A survey
    Du, Kelvin
    Zhao, Yazhi
    Mao, Rui
    Xing, Frank
    Cambria, Erik
    INFORMATION FUSION, 2025, 115
  • [37] Sentence Compression with Natural Language Generation
    Li, Peng
    Wang, Yinglin
    KNOWLEDGE ENGINEERING AND MANAGEMENT, 2011, 123 : 357 - 363
  • [38] Natural language generation of surgical procedures
    Wagner, JC
    Rogers, JE
    Baud, RH
    Scherrer, JR
    MEDINFO '98 - 9TH WORLD CONGRESS ON MEDICAL INFORMATICS, PTS 1 AND 2, 1998, 52 : 591 - 595
  • [39] Natural Language Generation from Ontologies
    Nguyen, Van
    Son, Tran Cao
    Pontelli, Enrico
    PRACTICAL ASPECTS OF DECLARATIVE LANGUAGES (PADL 2019), 2019, 11372 : 64 - 81
  • [40] Natural Language Generation and Semantic Technologies
    Staykova, Kamenka
    CYBERNETICS AND INFORMATION TECHNOLOGIES, 2014, 14 (02) : 3 - 23