The survey: Text generation models in deep learning

Cited by: 94
Authors
Iqbal, Touseef [1 ]
Qureshi, Shaima [1 ]
Affiliations
[1] Natl Inst Technol, Dept Comp Sci & Engn, Srinagar, Jammu & Kashmir, India
Keywords
Natural Language Processing (NLP); Deep learning; Word embeddings; Recurrent Neural Networks (RNNs); Convolutional Neural Networks (CNNs); Variational Auto-Encoders (VAEs); Generative Adversarial Networks (GANs); Text generation techniques; Activation functions; Optimization techniques;
DOI
10.1016/j.jksuci.2020.04.001
Chinese Library Classification
TP [automation technology, computer technology]
Subject Classification Code
0812
Abstract
Deep learning methods use many processing layers to learn hierarchical representations of data and have achieved state-of-the-art results in several domains. Recently, new deep learning model designs and architectures have emerged in the context of Natural Language Processing (NLP). This survey gives a brief account of the advances made in deep generative modeling; most of the papers considered are from 2015 onwards. We review the many deep learning models that have been used for text generation, summarize them, and offer a detailed view of the past, present, and future of text generation models in deep learning. The survey also covers deep learning approaches that have been explored and evaluated in different NLP application domains. (C) 2020 The Authors. Published by Elsevier B.V. on behalf of King Saud University.
Pages: 2515-2528
Page count: 14