Natural Language Generation Using Sequential Models: A Survey

Citations: 0
Authors
Abhishek Kumar Pandey
Sanjiban Sekhar Roy
Affiliation
[1] Vellore Institute of Technology, School of Computer Science and Engineering, Vellore
Source
Neural Processing Letters | 2023, Vol. 55
Keywords
Natural language processing; Long short-term memory; Natural language generation; Recurrent neural network; Sequential generative model; Story generation
DOI
Not available
Abstract
Natural Language Generation (NLG) is one of the most critical yet challenging tasks among Natural Language Processing applications. It automates the production of text that humans can read and understand. A number of research articles in the literature have described how NLG can produce understandable text in various languages. Sequence-to-sequence modeling powered by deep learning techniques such as Long Short-Term Memory networks, Recurrent Neural Networks, and Gated Recurrent Units has become popular for text generation. This survey provides a comprehensive overview of text generation and its related techniques, including statistical, traditional, and neural network-based approaches. Generating text with a sequence-to-sequence model is not a simple task, as the model may need to handle both continuous data, such as images, and discrete data, such as text. We therefore identify several areas for further research on text generation, such as incorporating large text datasets, detecting and correcting grammatical errors, and generating long sentences or paragraphs. This work also presents a detailed overview of the activation functions used in deep learning-based models and of the evaluation metrics used for text generation.
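To make the gating mechanism behind the LSTM-based sequential models discussed above concrete, here is a minimal NumPy sketch of a single LSTM forward step. This is an illustrative sketch of the standard LSTM equations, not code from the survey; the parameter names `W`, `U`, and `b` and the stacked-gate layout are our own conventions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One forward step of a standard LSTM cell.

    x: input vector of shape (d,); h_prev, c_prev: previous hidden and
    cell states of shape (n,). W (4n, d), U (4n, n), and b (4n,) hold
    the stacked parameters for the input, forget, and output gates and
    the candidate cell state.
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b      # pre-activations for all four gates
    i = sigmoid(z[0 * n:1 * n])     # input gate
    f = sigmoid(z[1 * n:2 * n])     # forget gate
    o = sigmoid(z[2 * n:3 * n])     # output gate
    g = np.tanh(z[3 * n:4 * n])     # candidate cell state
    c = f * c_prev + i * g          # new cell state
    h = o * np.tanh(c)              # new hidden state
    return h, c

# Tiny usage example with random parameters (illustrative only).
rng = np.random.default_rng(0)
d, n = 8, 4
W = rng.normal(size=(4 * n, d))
U = rng.normal(size=(4 * n, n))
b = np.zeros(4 * n)
h, c = np.zeros(n), np.zeros(n)
for x in rng.normal(size=(5, d)):   # unroll over a 5-step input sequence
    h, c = lstm_step(x, h, c, W, U, b)
```

In a text generator, `x` would be an embedding of the previous token and `h` would feed a softmax layer over the vocabulary at each step; the sigmoid and tanh activations here are among the activation functions the survey reviews.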
Pages: 7709–7742 (33 pages)