Time Series Prediction Based on LSTM-Attention-LSTM Model

Cited by: 39
Authors
Wen, Xianyun
Li, Weibang [1 ]
Affiliations
[1] Southwest Minzu Univ, Key Lab Comp Syst State Ethn Affairs Commiss, Chengdu 610041, Peoples R China
Keywords
Time series forecasting; long short-term memory networks; encoder and decoder model; attention mechanisms
DOI
10.1109/ACCESS.2023.3276628
CLC Classification Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Time series forecasting uses data from past periods to predict future information, which is of great significance in many applications. Existing time series forecasting methods still suffer from low accuracy when dealing with some non-stationary multivariate time series data. To address these shortcomings, in this paper we propose a new time series forecasting model, LSTM-attention-LSTM. The model uses two LSTM networks as the encoder and decoder, and introduces an attention mechanism between them. The model has two distinctive features: first, by using the attention mechanism to compute the interrelationships within the sequence data, it overcomes the drawback of the encoder-decoder model that the decoder cannot exploit sufficiently long input sequences; second, it is suitable for forecasting over long time steps. We validate the proposed model on several real data sets, and the results show that the LSTM-attention-LSTM model is more accurate in prediction than several currently dominant models. The experiments also assess the effect of the attention mechanism at different time steps by varying the time step.
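A minimal sketch of the architecture described in the abstract, assuming a PyTorch implementation: an encoder LSTM reads the input window, an additive-style attention over the encoder states is recomputed at every decoding step, and a decoder LSTM produces the multi-step forecast. The class name, layer sizes, attention scoring function, and the choice to seed the decoder with the last observed value are illustrative assumptions, not details given by the paper.

# Hypothetical LSTM-attention-LSTM forecaster; shapes and attention form are assumptions.
import torch
import torch.nn as nn


class LSTMAttentionLSTM(nn.Module):
    def __init__(self, n_features: int, hidden_size: int = 64, horizon: int = 12):
        super().__init__()
        self.horizon = horizon
        # Encoder LSTM reads the past window and returns a hidden state for every step.
        self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        # Decoder LSTM generates the forecast one step at a time, fed with the attention context.
        self.decoder = nn.LSTM(1 + hidden_size, hidden_size, batch_first=True)
        self.attn_score = nn.Linear(2 * hidden_size, 1)  # additive-style scoring (assumption)
        self.out = nn.Linear(hidden_size, 1)

    def attention(self, dec_h, enc_outputs):
        # dec_h: (batch, hidden), enc_outputs: (batch, T, hidden)
        T = enc_outputs.size(1)
        dec_h_rep = dec_h.unsqueeze(1).expand(-1, T, -1)                        # (batch, T, hidden)
        scores = self.attn_score(torch.cat([dec_h_rep, enc_outputs], dim=-1))   # (batch, T, 1)
        weights = torch.softmax(scores, dim=1)                                  # weights over encoder steps
        context = (weights * enc_outputs).sum(dim=1)                            # (batch, hidden)
        return context

    def forward(self, x):
        # x: (batch, T, n_features) past window; returns (batch, horizon) forecasts.
        enc_outputs, (h, c) = self.encoder(x)
        dec_input = x[:, -1:, :1]  # seed the decoder with the last observed target value
        preds = []
        for _ in range(self.horizon):
            context = self.attention(h[-1], enc_outputs)            # attention recomputed each step
            step_in = torch.cat([dec_input, context.unsqueeze(1)], dim=-1)
            dec_out, (h, c) = self.decoder(step_in, (h, c))
            y = self.out(dec_out[:, -1])                            # (batch, 1)
            preds.append(y)
            dec_input = y.unsqueeze(1)                              # feed the prediction back in
        return torch.cat(preds, dim=1)


# Usage example with random data (shapes only).
model = LSTMAttentionLSTM(n_features=4, hidden_size=64, horizon=12)
x = torch.randn(8, 48, 4)    # batch of 8 windows, 48 past steps, 4 variables
print(model(x).shape)        # torch.Size([8, 12])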
Pages: 48322 - 48331
Page count: 10
Related Papers
50 records in total
  • [1] EA-LSTM: Evolutionary attention-based LSTM for time series prediction
    Li, Youru
    Zhu, Zhenfeng
    Kong, Deqiang
    Han, Hua
    Zhao, Yao
    KNOWLEDGE-BASED SYSTEMS, 2019, 181
  • [2] Hydrological Time Series Prediction Model Based on Attention-LSTM Neural Network
    Li, Yiran
    Yang, Juan
    PROCEEDINGS OF THE 2019 2ND INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND MACHINE INTELLIGENCE (MLMI 2019), 2019, : 21 - 25
  • [3] A Hybrid Attention-Based EMD-LSTM Model for Financial Time Series Prediction
    Chen, Lu
    Chi, Yonggang
    Guan, Yingying
    Fan, Jialin
    2019 2ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND BIG DATA (ICAIBD 2019), 2019, : 113 - 118
  • [4] CDA-LSTM: an evolutionary convolution-based dual-attention LSTM for univariate time series prediction
    Chu, Xiaoquan
    Jin, Haibin
    Li, Yue
    Feng, Jianying
    Mu, Weisong
    NEURAL COMPUTING & APPLICATIONS, 2021, 33 (23): 16113 - 16137
  • [5] FD-LSTM: A Fuzzy LSTM Model for Chaotic Time-Series Prediction
    Langeroudi, Milad Keshtkar
    Yamaghani, Mohammad Reza
    Khodaparast, Siavash
    IEEE INTELLIGENT SYSTEMS, 2022, 37 (04) : 70 - 78
  • [6] A stream prediction model based on attention-LSTM
    Yan, Le
    Chen, Changwei
    Hang, Tingting
    Hu, Youchuan
    EARTH SCIENCE INFORMATICS, 2021, 14 (02): 723 - 733
  • [7] Online Attention Enhanced Differential and Decomposed LSTM for Time Series Prediction
    Li, Lina
    Huang, Shengkui
    Liu, Guoxing
    Luo, Cheng
    Yu, Qinghe
    Li, Nianfeng
    IEEE ACCESS, 2024, 12: 62416 - 62428
  • [8] Vulnerability Time Series Prediction Based on Multivariable LSTM
    Wu, Shuang
    Wang, Congyi
    Zeng, Jianping
    Wu, Chengrong
    2020 IEEE 14TH INTERNATIONAL CONFERENCE ON ANTI-COUNTERFEITING, SECURITY, AND IDENTIFICATION (ASID), 2020, : 185 - +