CLformer: Locally grouped auto-correlation and convolutional transformer for long-term multivariate time series forecasting

Cited by: 30
Authors
Wang, Xingyu [1 ]
Liu, Hui [1 ]
Du, Junzhao [1 ]
Yang, Zhihan [1 ]
Dong, Xiyao [1 ]
Affiliations
[1] Xidian Univ, Sch Comp Sci & Technol, Xian 710071, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Time series forecasting; Time series decomposition; Causal convolution; Locally grouped auto-correlation; Transformer; Neural networks; Decomposition; Attention; Model
DOI
10.1016/j.engappai.2023.106042
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
Improving the performance of long-term time series forecasting is important for real-world applications. Recently, Transformer-based models have achieved significant performance gains in long-term time series prediction. However, these models are memory-intensive and cannot capture temporal patterns at multiple scales. To this end, we propose to integrate the time series decomposition method into the Transformer framework, enabling the model to extract short- and long-term temporal patterns from the more predictable seasonal and trend components. In this paper, we propose a Transformer-based model named CLformer. Unlike previous methods, we exploit dilated convolutional networks to capture and refine multiple temporally repeated patterns in time series before time series decomposition. To enable the model to capture dependencies at multiple scales, we propose a locally grouped auto-correlation (LGAC) mechanism. The LGAC mechanism calculates auto-correlation within time series segments, strengthening the model's ability to capture the local temporal dynamics of series. Stacking multiple LGAC layers enables the model to capture multi-scale dependencies, which in turn improves its predictive performance. The CLformer outperforms models using the global auto-correlation mechanism and self-attention in both efficiency and accuracy. Experimental results on six benchmark datasets show that our model obtains a relative performance improvement of 11.75% compared to state-of-the-art methods. In addition, CLformer achieves a relative performance improvement of 18.89% on two datasets without apparent periodicity, demonstrating the effectiveness of our model on time series without significant periodicity.
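Based only on the abstract's description, the core of LGAC is computing auto-correlation within local segments of a series rather than over the full sequence. The following is a minimal NumPy sketch of that idea; the function name, the FFT-based (Wiener-Khinchin) computation, and the normalization are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def local_group_autocorrelation(x, group_size):
    """Sketch of segment-wise (locally grouped) auto-correlation.

    x: 1-D array; only the first n_groups * group_size samples are used.
    Returns an array of shape (n_groups, group_size) where row g holds the
    circular auto-correlation of segment g at lags 0..group_size-1,
    normalized so that lag 0 equals 1.
    """
    n_groups = len(x) // group_size
    segments = x[: n_groups * group_size].reshape(n_groups, group_size)
    # Wiener-Khinchin: auto-correlation via the power spectrum, per segment.
    spec = np.fft.rfft(segments, axis=-1)
    acf = np.fft.irfft(spec * np.conj(spec), n=group_size, axis=-1)
    # Normalize each segment by its lag-0 value.
    acf /= acf[:, :1] + 1e-8
    return acf

# Usage: a signal with period 8 shows a correlation peak at lag 8
# inside every local group of 16 samples.
t = np.arange(64)
acf = local_group_autocorrelation(np.sin(2 * np.pi * t / 8), group_size=16)
```

Computing the correlation per segment keeps the cost near O(L * log g) for sequence length L and group size g, which is consistent with the abstract's efficiency claim relative to global auto-correlation; the exact complexity of the paper's mechanism is not stated in this record.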
Pages: 13