Time Series Prediction Based on LSTM-Attention-LSTM Model

Cited by: 41
Authors
Wen, Xianyun
Li, Weibang [1 ]
Affiliations
[1] Southwest Minzu Univ, Key Lab Comp Syst State Ethn Affairs Commiss, Chengdu 610041, Peoples R China
Keywords
Time series forecasting; long short-term memory networks; encoder-decoder model; attention mechanisms
DOI
10.1109/ACCESS.2023.3276628
CLC Classification
TP [Automation technology, computer technology]
Subject Classification
0812
Abstract
Time series forecasting uses data from past periods to predict future information, which is of great significance in many applications. Existing forecasting methods still suffer from low accuracy when dealing with some non-stationary multivariate time series. To address these shortcomings, in this paper we propose a new time series forecasting model, LSTM-attention-LSTM. The model uses two LSTM networks as the encoder and decoder, and introduces an attention mechanism between them. The model has two distinctive features: first, by using the attention mechanism to compute the interrelationships among the sequence data, it overcomes the drawback of the encoder-decoder model that the decoder cannot access sufficiently long input sequences; second, it is suitable for forecasting over long time steps. We validate the proposed model on several real data sets, and the results show that the LSTM-attention-LSTM model predicts more accurately than several currently dominant models. The experiments also assess the effect of the attention mechanism at different time steps by varying the time step.
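The attention step between the encoder and decoder that the abstract describes can be sketched as follows. This is a minimal, generic dot-product attention over encoder hidden states in NumPy; the paper's exact scoring function and dimensions are not specified in this record, so the shapes and the dot-product score are assumptions for illustration.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_context(encoder_states, decoder_state):
    """Score each encoder hidden state against the current decoder state,
    normalize the scores with softmax, and return the weighted sum of
    encoder states (the context vector) together with the weights.

    encoder_states: (T, H) array, one hidden state per input time step.
    decoder_state:  (H,) array, the decoder's current hidden state.
    """
    scores = encoder_states @ decoder_state   # (T,) raw alignment scores
    weights = softmax(scores)                 # (T,) weights summing to 1
    context = weights @ encoder_states        # (H,) attended summary
    return context, weights

# Toy example: 4 encoder time steps, hidden size 3.
enc = np.array([[0.1, 0.2, 0.3],
                [0.4, 0.1, 0.0],
                [0.9, 0.8, 0.7],
                [0.2, 0.2, 0.2]])
dec = np.array([1.0, 0.5, 0.5])
ctx, w = attention_context(enc, dec)
```

The context vector `ctx` would then be fed to the decoder LSTM at each output step, which is how such a model lets the decoder draw on the full input sequence rather than a single fixed-length summary.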
Pages: 48322-48331
Page count: 10