NEWLSTM: An Optimized Long Short-Term Memory Language Model for Sequence Prediction

Cited by: 18
Authors
Wang, Qing [1 ]
Peng, Rong-Qun [1 ]
Wang, Jia-Qiang [2 ]
Li, Zhi [3 ]
Qu, Han-Bing [2 ]
Affiliations
[1] Shandong Univ Technol, Sch Comp Sci & Technol, Zibo 255049, Peoples R China
[2] Beijing Acad Sci & Technol, Key Lab Artificial Intelligence & Data Anal, Beijing 100094, Peoples R China
[3] Univ Chinese Acad Sci, Sch Econ & Management, Beijing 100049, Peoples R China
Source
IEEE ACCESS | 2020, Vol. 8
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation
Keywords
Logic gates; Recurrent neural networks; Task analysis; Predictive models; Natural language processing; Context modeling; Data models; Gate fusion; exploding gradient; long short-term memory; recurrent neural network; NETWORKS;
DOI
10.1109/ACCESS.2020.2985418
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
The long short-term memory (LSTM) model trained on the universal language modeling task overcomes the vanishing-gradient bottleneck of the traditional recurrent neural network (RNN) and performs well on many natural language processing tasks. However, although LSTM alleviates the vanishing gradient problem of the RNN, substantial information is still lost over long-distance transmission, which limits its practical use. In this paper, we propose NEWLSTM, an improved LSTM model that addresses both the excessive parameter count of LSTM and the vanishing gradient. The NEWLSTM model directly correlates the cell state with the current input. The traditional LSTM's input gate and forget gate are fused and some components are removed, which reduces the number of parameters, simplifies computation, and shortens iteration time. A neural network model is then used to identify relationships among input sequences in order to predict the language sequence. Experimental results on multiple test sets show that the improved model is simpler than traditional LSTM models and LSTM variants, has better overall stability, and better handles the sparse-words problem.
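Note: the abstract does not give the NEWLSTM equations, so the following is only a minimal NumPy sketch of one well-known way to fuse the input and forget gates, the coupled input-forget gate (CIFG) variant studied by Greff et al., in which the input gate is tied to the forget gate as i = 1 - f and its weight matrices are dropped. All function and parameter names here are hypothetical; the paper's actual formulation may differ.

    import numpy as np

    def sigmoid(x):
        # Logistic sigmoid used by the gates.
        return 1.0 / (1.0 + np.exp(-x))

    def fused_gate_lstm_step(x, h_prev, c_prev, p):
        # Forget gate; the input gate is tied to it as i = 1 - f,
        # so no separate input-gate weights are needed.
        f = sigmoid(p["Wf"] @ x + p["Uf"] @ h_prev + p["bf"])
        # Candidate cell state from the current input and previous hidden state.
        c_tilde = np.tanh(p["Wc"] @ x + p["Uc"] @ h_prev + p["bc"])
        # Fused update: old state and new candidate share one gate.
        c = f * c_prev + (1.0 - f) * c_tilde
        # Output gate and hidden state, as in a standard LSTM.
        o = sigmoid(p["Wo"] @ x + p["Uo"] @ h_prev + p["bo"])
        h = o * np.tanh(c)
        return h, c

    # Toy usage: run three steps on random inputs.
    rng = np.random.default_rng(0)
    d_in, d_hid = 4, 8
    shapes = {"Wf": (d_hid, d_in), "Uf": (d_hid, d_hid), "bf": (d_hid,),
              "Wc": (d_hid, d_in), "Uc": (d_hid, d_hid), "bc": (d_hid,),
              "Wo": (d_hid, d_in), "Uo": (d_hid, d_hid), "bo": (d_hid,)}
    p = {k: 0.1 * rng.standard_normal(s) for k, s in shapes.items()}
    h, c = np.zeros(d_hid), np.zeros(d_hid)
    for _ in range(3):
        h, c = fused_gate_lstm_step(rng.standard_normal(d_in), h, c, p)
    print(h.shape, c.shape)  # (8,) (8,)

Tying the two gates as i = 1 - f removes one input-to-hidden and one hidden-to-hidden weight matrix per layer, which matches the parameter-reduction motivation described in the abstract.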
Pages: 65395-65401
Number of pages: 7