A hybrid framework for multivariate long-sequence time series forecasting

Cited by: 6
Authors
Wang, Xiaohu [1 ,2 ]
Wang, Yong [1 ,2 ]
Peng, Jianjian [1 ,2 ]
Zhang, Zhicheng [1 ,2 ]
Tang, Xueliang [3 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci, Xi Yuan Ave, Chengdu 611731, Sichuan, Peoples R China
[2] Univ Elect Sci & Technol China, Engn Ctr CyberSecur, Xi Yuan Ave, Chengdu 611731, Sichuan, Peoples R China
[3] SI TECH Informat Technol Co Ltd, Beijing, Peoples R China
Keywords
Time series forecasting; Time sequence decomposition; Graph attention network; Interactive learning; Learning approach; Prediction
DOI
10.1007/s10489-022-04110-1
CLC Classification Number
TP18 [Artificial intelligence theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Time series forecasting provides insights into the far future by utilizing available historical observations. Recent studies have demonstrated the superiority of transformer-based models for multivariate long-sequence time series forecasting (MLTSF). However, data complexity hinders the forecasting accuracy of current deep neural network models. In this article, a hybrid framework, Waveformer, is proposed, which decomposes a fluctuating and complex data sequence into multiple stable and more predictable subsequences (components) throughout the entire forecasting process. Waveformer interactively learns temporal dependencies on each pair of decomposed components, which enhances its ability to learn those dependencies. Moreover, Waveformer treats the implicit and dynamic dependencies among variables as a set of dynamic directed graphs. On this basis, an attention adaptive graph convolution network (AAGCN) is designed, which combines self-attention and adaptive directed graph convolution to capture multivariate dynamic dependencies in a flexible manner. Experimental results on six public datasets show that Waveformer considerably outperforms a wide range of state-of-the-art benchmarks, with up to 54.3% relative improvement.
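The abstract's core idea of splitting a fluctuating sequence into more stable, more predictable components can be illustrated with a generic moving-average decomposition. Note this is only a minimal sketch of the general technique; the paper's actual decomposition scheme, the `decompose` function name, and the kernel size are assumptions, not Waveformer's implementation.

```python
# Hypothetical sketch of series decomposition: split a 1-D series into a
# smooth moving-average trend and a residual component. Illustrative only;
# not the paper's actual method.
import numpy as np

def decompose(series: np.ndarray, kernel: int = 9):
    """Return (trend, residual) with trend + residual == series."""
    pad = kernel // 2
    # Replicate-pad both ends so the trend keeps the input length.
    padded = np.concatenate([np.full(pad, series[0]),
                             series,
                             np.full(pad, series[-1])])
    window = np.ones(kernel) / kernel
    trend = np.convolve(padded, window, mode="valid")
    residual = series - trend
    return trend, residual

t = np.arange(100)
x = 0.1 * t + np.sin(0.5 * t)   # slow trend plus oscillation
trend, residual = decompose(x)
```

Each resulting component (here, trend and residual) varies less than the raw input, which is what makes the downstream per-component forecasting described in the abstract easier.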
Pages: 13549-13568
Number of pages: 20