Graph correlated attention recurrent neural network for multivariate time series forecasting

Cited by: 42
Authors
Geng, Xiulin [1 ]
He, Xiaoyu [1 ]
Xu, Lingyu [1 ,2 ]
Yu, Jie [1 ]
Affiliations
[1] Shanghai Univ, Sch Comp Engn & Sci, Shanghai, Peoples R China
[2] Shanghai Univ, Shanghai Inst Adv Commun & Data Sci, Shanghai, Peoples R China
Keywords
Multivariate time series; Feature-level attention; Graph attention; Multi-level attention; Memory ability; Models
DOI
10.1016/j.ins.2022.04.045
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Multivariate time series (MTS) forecasting is an urgent problem for numerous valuable applications. At present, attention-based methods can relieve recurrent neural networks' limitations in MTS forecasting, namely the difficulty of focusing on key information and capturing long-term dependencies, but they fail to learn time-varying patterns based on reliable interactions. To reinforce the memory ability of key features across time, we propose a Graph Correlated Attention Recurrent Neural Network (GCAR). GCAR first nests feature-level attention in the graph attention module to complement external feature representations in the extraction of multi-head temporal correlations. Multi-level attention is then designed to add the target factors' impact on the selection of external correlations and to achieve a fine-grained distinction of external features' contributions. To better capture the continuous dynamic changes of different series, two parallel LSTMs are applied to learn the temporal dependencies of the historical target series and of the external feature representations, respectively. Finally, a fusion gate is employed to balance their information conflicts. The performance of the GCAR model is tested on 4 datasets, and the results show that GCAR achieves the most stable and highest predictive accuracy as the prediction horizon increases, compared with state-of-the-art models, even when the multivariate time series exhibit strong volatility and randomness. (c) 2022 Published by Elsevier Inc.
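The architecture described in the abstract can be summarized in a short PyTorch sketch. All module names, layer sizes, and the use of nn.MultiheadAttention as a stand-in for the paper's graph attention module are illustrative assumptions, not the authors' released implementation.

# A rough PyTorch sketch of a GCAR-style model as described in the abstract.
# Every module name, dimension, and wiring choice here is an assumption for
# illustration only, not the authors' implementation.
import torch
import torch.nn as nn


class GCARSketch(nn.Module):
    def __init__(self, n_ext: int, hidden: int = 64, heads: int = 4):
        super().__init__()
        # Feature-level attention: re-weights each external series per time step.
        self.feat_attn = nn.Sequential(nn.Linear(n_ext, n_ext), nn.Softmax(dim=-1))
        # Graph attention over external features, approximated here with
        # multi-head self-attention across the feature dimension.
        self.embed_ext = nn.Linear(1, hidden)
        self.graph_attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        # Multi-level attention: the target history selects which external
        # correlations matter at each step.
        self.target_query = nn.Linear(1, hidden)
        self.multi_attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        # Two parallel LSTMs: one for the target series, one for external representations.
        self.lstm_target = nn.LSTM(1, hidden, batch_first=True)
        self.lstm_ext = nn.LSTM(hidden, hidden, batch_first=True)
        # Fusion gate balancing the two information streams.
        self.gate = nn.Sequential(nn.Linear(2 * hidden, hidden), nn.Sigmoid())
        self.out = nn.Linear(hidden, 1)

    def forward(self, target, external):
        # target:   (batch, T, 1)      historical target series
        # external: (batch, T, n_ext)  external (exogenous) series
        B, T, n_ext = external.shape
        # Feature-level attention re-weights the external series.
        ext = external * self.feat_attn(external)
        # Embed each external value and attend across features at every time step.
        ext = self.embed_ext(ext.reshape(B * T, n_ext, 1))      # (B*T, n_ext, H)
        ext, _ = self.graph_attn(ext, ext, ext)
        # Multi-level attention with the target value as query.
        q = self.target_query(target.reshape(B * T, 1, 1))      # (B*T, 1, H)
        ext, _ = self.multi_attn(q, ext, ext)                   # (B*T, 1, H)
        ext = ext.reshape(B, T, -1)
        # Parallel LSTMs capture temporal dependencies of both streams.
        h_tgt, _ = self.lstm_target(target)
        h_ext, _ = self.lstm_ext(ext)
        # Fusion gate balances conflicts between the two streams.
        g = self.gate(torch.cat([h_tgt[:, -1], h_ext[:, -1]], dim=-1))
        fused = g * h_tgt[:, -1] + (1 - g) * h_ext[:, -1]
        return self.out(fused)                                  # one-step-ahead forecast


# Example: batch of 8 windows, 24 time steps, 6 external series.
model = GCARSketch(n_ext=6)
y_hat = model(torch.randn(8, 24, 1), torch.randn(8, 24, 6))
print(y_hat.shape)  # torch.Size([8, 1])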
Pages: 126-142
Number of pages: 17
Related References
47 references in total
[11] Han, Shuang; Dong, Hongbin; Teng, Xuyang; Li, Xiaohui; Wang, Xiaowei. Correlational graph attention-based Long Short-Term Memory network for multivariate time series prediction. Applied Soft Computing, 2021, 106.
[12] Haslbeck, Jonas M. B.; Bringmann, Laura F.; Waldorp, Lourens J. A Tutorial on Estimating Time-Varying Vector Autoregressive Models. Multivariate Behavioral Research, 2020, 56(1): 120-149.
[13] Hogue, J. [Data object].
[14] Hu, Jun; Zheng, Wendong. A deep learning model to effectively capture mutation information in multivariate time series prediction. Knowledge-Based Systems, 2020, 203.
[15] Hu, Ziniu; Dong, Yuxiao; Wang, Kuansan; Sun, Yizhou. Heterogeneous Graph Transformer. Web Conference 2020: Proceedings of the World Wide Web Conference (WWW 2020), 2020: 2704-2710.
[16] Hua, Yuxiu; Zhao, Zhifeng; Li, Rongpeng; Chen, Xianfu; Liu, Zhiming; Zhang, Honggang. Deep Learning with Long Short-Term Memory for Time Series Prediction. IEEE Communications Magazine, 2019, 57(6): 114-119.
[17] Karevan, Zahra; Suykens, Johan A. K. Transductive LSTM for time-series prediction: An application to weather forecasting. Neural Networks, 2020, 125: 1-9.
[18] Khashei, Mehdi; Hajirahimi, Zahra. A comparative study of series ARIMA/MLP hybrid models for stock price forecasting. Communications in Statistics - Simulation and Computation, 2019, 48(9): 2625-2640.
[19] Khodayar, Mahdi; Wang, Jianhui. Spatio-Temporal Graph Deep Neural Network for Short-Term Wind Speed Forecasting. IEEE Transactions on Sustainable Energy, 2019, 10(2): 670-681.
[20] Li, Jince; Yang, Bo; Li, Hongguang; Wang, Yongjian; Qi, Chu; Liu, Yi. DTDR-ALSTM: Extracting dynamic time-delays to reconstruct multivariate data for improving attention-based LSTM industrial time series prediction models. Knowledge-Based Systems, 2021, 211.