How Features Benefit: Parallel Series Embedding for Multivariate Time Series Forecasting with Transformer

Cited by: 2
Authors
Feng, Xuande [1 ]
Lyu, Zonglin [1 ]
Affiliations
[1] Columbia Univ, Dept Ind Engn & Operat Res, New York, NY 10027 USA
Source
2022 IEEE 34TH INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE, ICTAI | 2022
Keywords
Multivariate time series; Deep learning; Parallel series embedding; Transformer; MODELS;
DOI
10.1109/ICTAI56018.2022.00148
CLC number
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Forecasting time series is an engaging and vital mathematical topic. Theories and applications in related fields have been studied for decades, and deep learning has provided reliable tools in recent years. The Transformer, capable of capturing long-range sequence dependencies, has been exploited as a powerful architecture for time series forecasting. However, while most existing work has focused on breaking the Transformer's memory bottleneck, how to effectively leverage multivariate time series has received little attention. In this work, a novel architecture built on a primary Transformer is proposed for multivariate time series prediction. The proposed architecture has two main advantages. First, it accurately predicts multivariate time series across shorter and longer input lengths and forecasting horizons; benchmarked against various baseline architectures on real-world datasets, it improves on their performance significantly. Second, it can easily be incorporated into Transformer-based variants, which guarantees broad applicability of the proposed work.
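The abstract gives only a high-level description of the architecture. As a rough, assumption-based sketch (not the authors' code), the PyTorch snippet below illustrates one way a parallel series embedding could feed a standard Transformer encoder for multivariate forecasting: each series receives its own linear embedding applied in parallel before a shared encoder. The class name, the learned positional term, the forecasting head, and all hyperparameters are hypothetical choices made only to show the general pattern.

# Hypothetical sketch, not the authors' implementation.
import torch
import torch.nn as nn

class ParallelSeriesTransformer(nn.Module):
    def __init__(self, n_series: int, seq_len: int, horizon: int,
                 d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        # One embedding per series, applied in parallel across variables.
        self.embeddings = nn.ModuleList(
            [nn.Linear(1, d_model) for _ in range(n_series)]
        )
        # Learned positional term added to the combined embeddings.
        self.pos = nn.Parameter(torch.zeros(1, seq_len, d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # Map each encoded step back to all series, then map the input
        # window length to the forecast horizon.
        self.head = nn.Linear(d_model, n_series)
        self.time_proj = nn.Linear(seq_len, horizon)

    def forward(self, x):  # x: (batch, seq_len, n_series)
        # Embed each series separately and sum the parallel embeddings.
        z = sum(emb(x[..., i:i + 1]) for i, emb in enumerate(self.embeddings))
        z = self.encoder(z + self.pos)   # (batch, seq_len, d_model)
        y = self.head(z)                 # (batch, seq_len, n_series)
        return self.time_proj(y.transpose(1, 2)).transpose(1, 2)  # (batch, horizon, n_series)

# Example: forecast 24 future steps for 7 series from a 96-step window.
model = ParallelSeriesTransformer(n_series=7, seq_len=96, horizon=24)
print(model(torch.randn(8, 96, 7)).shape)  # torch.Size([8, 24, 7])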
Pages: 967-975
Number of pages: 9