Multivariate Time Series Forecasting Using Multiscale Recurrent Networks With Scale Attention and Cross-Scale Guidance

Times Cited: 12
Authors
Guo, Qiang [1 ,2 ]
Fang, Lexin [3 ]
Wang, Ren [3 ]
Zhang, Caiming [3 ,4 ]
Affiliations
[1] Shandong Univ Finance & Econ, Sch Comp Sci, Jinan 250014, Peoples R China
[2] Shandong Univ Finance & Econ, Shandong Prov Key Lab Digital Media Technol, Jinan 250014, Peoples R China
[3] Shandong Univ, Sch Software, Jinan 250100, Peoples R China
[4] Shandong Prov Lab Future Intelligence & Financial, Yantai 264005, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Attention mechanism; cross-scale guidance; multiscale decomposition; recurrent neural networks (RNNs); time series forecasting; SINGULAR SPECTRUM ANALYSIS;
DOI
10.1109/TNNLS.2023.3326140
CLC Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multivariate time series (MTS) forecasting is a challenging task due to the complex and nonlinear interdependencies between time steps and series. With the advance of deep learning, significant efforts have been made to model the long-term and short-term temporal patterns hidden in historical data using recurrent neural networks (RNNs) with a temporal attention mechanism. Although various forecasting models have been developed, most are single-scale oriented, resulting in a loss of scale information. In this article, we seamlessly integrate multiscale analysis into deep learning frameworks to build scale-aware recurrent networks and propose two multiscale recurrent network (MRN) models for MTS forecasting. The first model, called MRN-SA, adopts a scale attention mechanism to dynamically select the most relevant information from different scales, while simultaneously employing input attention and temporal attention to make predictions. The second, named MRN-CSG, introduces a novel cross-scale guidance mechanism that exploits information from the coarse scale to guide the decoding process at the fine scale, yielding a lightweight, more easily trained model with no obvious loss of accuracy. Extensive experimental results demonstrate that both MRN-SA and MRN-CSG achieve state-of-the-art performance on five typical MTS datasets from different domains. The source code will be publicly available at https://github.com/qguo2010/MRN.
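To make the scale attention idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation, which is at the GitHub link above): a series is decomposed into coarser views by moving-average smoothing at several window sizes, and a softmax-weighted attention over those views dynamically selects the most relevant scales. The function names, the choice of moving averages as the multiscale decomposition, and the simple dot-product scoring `query` are all illustrative assumptions.

```python
import numpy as np

def moving_average(x, w):
    # Edge-pad so the smoothed view keeps the original length; larger w = coarser scale.
    pad = np.concatenate([np.full(w - 1, x[0]), x])
    return np.convolve(pad, np.ones(w) / w, mode="valid")

def scale_attention(x, windows, query):
    """Softmax-weighted fusion of multiscale views of a 1-D series.

    x: series of length T; windows: smoothing windows defining the scales;
    query: length-T scoring vector (illustrative stand-in for a learned query).
    """
    scales = np.stack([moving_average(x, w) for w in windows])  # (S, T)
    scores = scales @ query                                     # one score per scale
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                                    # softmax over scales
    return weights @ scales                                     # fused (T,) representation

# Toy usage: noisy sine fused across three scales with a uniform query.
x = np.sin(np.linspace(0, 6, 32)) + 0.1 * np.random.default_rng(0).normal(size=32)
fused = scale_attention(x, windows=[1, 4, 8], query=np.ones(32) / 32)
```

In the paper's models the fused representation would feed an RNN decoder (with input and temporal attention in MRN-SA); here the fusion step alone is shown.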
Pages: 540-554
Page count: 15