Online Attention Enhanced Differential and Decomposed LSTM for Time Series Prediction

Cited by: 2
Authors
Li, Lina [1 ]
Huang, Shengkui [1 ]
Liu, Guoxing [2 ]
Luo, Cheng [1 ]
Yu, Qinghe [1 ]
Li, Nianfeng [1 ]
Affiliations
[1] Changchun Univ, Coll Comp Sci & Technol, Changchun 130022, Peoples R China
[2] Changchun Univ, Coll Cyber Secur, Changchun 130022, Peoples R China
Keywords
Online prediction; time series; LSTM; self-attention mechanism; difference and decomposition; neural networks; model; architecture
DOI
10.1109/ACCESS.2024.3395651
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
Accurate and lag-free time series prediction is challenging because of the time-varying and bursty nature of the data. To address these problems, we propose an online attention-enhanced differential and decomposed LSTM (Long Short-Term Memory) model, called OADDL, which better captures the comprehensive core features and important structures of a time series. In this model, the core features of the series are first extracted through differencing and decomposition, which reduce data complexity and remove noise. A self-attention module and an LSTM then capture the core features and important structures over the full time span, and finally a fully connected network (FCN) fuses the omnidirectional features of the series. We also design an online two-stage training mode for this model, in which the attention-enhanced LSTM and the FCN are trained sequentially while the training set and the model hyperparameters are continuously updated over time, further capturing the time-varying and bursty characteristics of the series. Experiments on three typical datasets show that, compared with recent representative deep learning models, OADDL predicts time series data more accurately and effectively alleviates the problem of prediction lag.
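This record does not include the authors' implementation, so the following is only a minimal PyTorch sketch of the pipeline the abstract describes: differencing and decomposition, then an attention-enhanced LSTM, then FCN fusion. The moving-average decomposition, layer sizes, and all names here (difference_and_decompose, OADDLSketch, d_model, horizon) are illustrative assumptions, not the paper's design, and the online two-stage training mode is not shown.

```python
# Hypothetical sketch of an OADDL-style forward pass (illustrative only):
# differencing + decomposition -> self-attention -> LSTM -> FCN fusion.
import torch
import torch.nn as nn


def difference_and_decompose(x: torch.Tensor, kernel: int = 5) -> torch.Tensor:
    """First-order differencing, then a moving-average trend/residual split.

    x: (batch, seq_len, 1) raw series. Returns (batch, seq_len - 1, 3)
    stacking [difference, trend, residual]; the decomposition choice is
    an assumption, since this record does not specify the method.
    """
    diff = x[:, 1:, :] - x[:, :-1, :]          # differencing removes level drift
    pad = kernel // 2
    trend = nn.functional.avg_pool1d(          # moving-average trend estimate
        nn.functional.pad(diff.transpose(1, 2), (pad, pad), mode="replicate"),
        kernel_size=kernel, stride=1,
    ).transpose(1, 2)
    residual = diff - trend                    # de-trended remainder
    return torch.cat([diff, trend, residual], dim=-1)


class OADDLSketch(nn.Module):
    """Attention-enhanced LSTM followed by a fully connected fusion head."""

    def __init__(self, d_model: int = 64, n_heads: int = 4, horizon: int = 1):
        super().__init__()
        self.embed = nn.Linear(3, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.lstm = nn.LSTM(d_model, d_model, batch_first=True)
        self.fcn = nn.Sequential(              # fuses the final features
            nn.Linear(d_model, d_model), nn.ReLU(), nn.Linear(d_model, horizon)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.embed(difference_and_decompose(x))
        z, _ = self.attn(z, z, z)              # self-attention across time steps
        z, _ = self.lstm(z)                    # sequential structure via LSTM
        return self.fcn(z[:, -1, :])           # predict from the last time step


# Usage: one-step-ahead prediction from a 32-sample batch of length-96 series.
model = OADDLSketch()
y_hat = model(torch.randn(32, 96, 1))          # -> shape (32, 1)
```

In the paper's online setting the attention-enhanced LSTM and the FCN would be trained in two sequential stages with the training window and hyperparameters refreshed over time; that loop is orthogonal to this architectural sketch.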
Pages: 62416-62428
Page count: 13