Online Attention Enhanced Differential and Decomposed LSTM for Time Series Prediction

Cited by: 2
Authors
Li, Lina [1 ]
Huang, Shengkui [1 ]
Liu, Guoxing [2 ]
Luo, Cheng [1 ]
Yu, Qinghe [1 ]
Li, Nianfeng [1 ]
Affiliations
[1] Changchun Univ, Coll Comp Sci & Technol, Changchun 130022, Peoples R China
[2] Changchun Univ, Coll Cyber Secur, Changchun 130022, Peoples R China
Keywords
Online prediction; time series; LSTM; self-attention mechanism; difference and decomposition; NEURAL-NETWORKS; MODEL; ARCHITECTURE;
DOI
10.1109/ACCESS.2024.3395651
CLC number
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
Due to the time variability and burstiness of data, accurate and lag-free time series prediction is difficult and challenging. To address these problems, we propose an online attention-enhanced differential and decomposed LSTM (Long Short-Term Memory) model, called OADDL, which better captures the comprehensive core features and important structures of time series. In this model, the core features of the time series are first generated through differencing and decomposition, which reduce data complexity and remove noise. Then, a self-attention module and an LSTM capture the full-horizon core features and important structures of the time series. Finally, an FCN (Fully Connected Network) fuses the omnidirectional features of the time series. Meanwhile, we design an online two-stage training mode for this model, in which the attention-enhanced LSTM and the FCN are trained sequentially, and the training set and model hyper-parameters are continuously updated over time, further capturing the time-varying and bursty characteristics of the time series. We conduct tests on three typical datasets, and the experimental results show that, compared with the latest typical deep learning models, OADDL predicts time series data more accurately and effectively alleviates the problem of prediction lag.
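The abstract's first stage, generating core features through differencing and decomposition, can be sketched as follows. This is a minimal illustration only: it assumes first-order differencing and a simple additive moving-average trend/residual split, since the record does not specify the exact decomposition the authors use (an STL-style method would be a common alternative); the function names are hypothetical.

```python
import numpy as np

def difference(series, order=1):
    """Apply first-order (or higher-order) differencing to reduce
    trend and non-stationarity before feature extraction."""
    out = np.asarray(series, dtype=float)
    for _ in range(order):
        out = out[1:] - out[:-1]
    return out

def decompose(series, window=3):
    """Split a series into a moving-average trend and a residual.

    Simple additive decomposition for illustration; the paper may
    use a different scheme."""
    x = np.asarray(series, dtype=float)
    kernel = np.ones(window) / window
    trend = np.convolve(x, kernel, mode="same")
    residual = x - trend
    return trend, residual

x = [1.0, 2.0, 4.0, 7.0, 11.0]
d = difference(x)            # successive differences: [1, 2, 3, 4]
trend, resid = decompose(x)  # trend + resid reconstructs x exactly
```

The differenced series and the decomposed components would then be fed to the attention-enhanced LSTM as lower-complexity inputs, in the spirit of the pipeline the abstract describes.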
Pages: 62416-62428 (13 pages)