Foreformer: an enhanced transformer-based framework for multivariate time series forecasting

Cited by: 14
Authors
Yang, Ye [1 ]
Lu, Jiangang [1 ,2 ]
Institutions
[1] Zhejiang Univ, Coll Control Sci & Engn, State Key Lab Ind Control Technol, Hangzhou 310027, Peoples R China
[2] Zhejiang Lab, Hangzhou 311121, Peoples R China
Keywords
Multivariate time series forecasting; Attention mechanism; Deep learning; Multi-resolution; Static covariate; Transformer; Convolutional networks
DOI
10.1007/s10489-022-04100-3
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Multivariate time series forecasting (MTSF) has been studied extensively over the years, with ubiquitous applications in finance, traffic, the environment, etc. Recent investigations have demonstrated the potential of the Transformer to improve forecasting performance. The Transformer, however, has limitations that prevent it from being applied directly to MTSF, such as insufficient extraction of temporal patterns at different time scales, extraction of irrelevant information in self-attention, and no targeted processing of static covariates. Motivated by the above, an enhanced Transformer-based framework for MTSF, named Foreformer, is proposed with three distinctive characteristics: (i) a multi-temporal resolution module that deeply captures temporal patterns at different scales, (ii) an explicit sparse attention mechanism that forces the model to prioritize the most contributive components, and (iii) a static covariate processing module for nonlinear processing of static covariates. Extensive experiments on three real-world datasets demonstrate that Foreformer outperforms existing methodologies, making it a reliable approach for MTSF tasks.
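The abstract does not specify how the explicit sparse attention in (ii) is realized. A common way to make attention prioritize only the most contributive components is top-k masking of the attention scores, so each query attends to just its k highest-scoring keys. The NumPy sketch below illustrates that general idea only; the function names and the top-k rule are assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax; -inf entries become exactly zero weight."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def topk_sparse_attention(Q, K, V, k):
    """Scaled dot-product attention where each query keeps only its
    k largest scores (ties may keep a few more); the rest are masked out."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])          # (n_q, n_k)
    kth = np.sort(scores, axis=-1)[:, -k][:, None]   # k-th largest per query
    masked = np.where(scores >= kth, scores, -np.inf)
    weights = softmax(masked, axis=-1)               # rows sum to 1 over kept keys
    return weights @ V

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((10, 8))
V = rng.standard_normal((10, 8))
out = topk_sparse_attention(Q, K, V, k=3)
print(out.shape)  # (4, 8): each query is a mix of only 3 value rows
```

Compared with dense softmax attention, this hard sparsification zeroes out low-relevance keys entirely, which is the stated motivation of avoiding "extraction of irrelevant information in self-attention".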
Pages: 12521 - 12540
Number of pages: 20
Related Papers
50 records in total
  • [41] Multi-Scale Transformer Pyramid Networks for Multivariate Time Series Forecasting
    Zhang, Yifan
    Wu, Rui
    Dascalu, Sergiu M.
    Harris, Frederick C.
    IEEE ACCESS, 2024, 12 : 14731 - 14741
  • [42] FEDAF: frequency enhanced decomposed attention free transformer for long time series forecasting
    Yang, X.
    Li, H.
    Huang, X.
    Feng, X.
    NEURAL COMPUTING AND APPLICATIONS, 2024, 36 (26) : 16271 - 16288
  • [43] SiET: Spatial information enhanced transformer for multivariate time series detection
    Xiong, Weixuan
    Wang, Peng
    Sun, Xiaochen
    Wang, Jun
    KNOWLEDGE-BASED SYSTEMS, 2024, 296
  • [44] Transformer-Based Model for Electrical Load Forecasting
    L'Heureux, Alexandra
    Grolinger, Katarina
    Capretz, Miriam A. M.
    ENERGIES, 2022, 15 (14)
  • [45] A Transformer-Based Bridge Structural Response Prediction Framework
    Li, Ziqi
    Li, Dongsheng
    Sun, Tianshu
    SENSORS, 2022, 22 (08)
  • [46] W-Transformers : A Wavelet-based Transformer Framework for Univariate Time Series Forecasting
    Sasal, Lena
    Chakraborty, Tanujit
    Hadid, Abdenour
    2022 21ST IEEE INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS, ICMLA, 2022, : 671 - 676
  • [47] A Bi-GRU-based encoder-decoder framework for multivariate time series forecasting
    Balti, Hanen
    Ben Abbes, Ali
    Farah, Imed Riadh
    SOFT COMPUTING, 2024, 28 (9-10) : 6775 - 6786
  • [48] Multivariate time series forecasting via attention-based encoder-decoder framework
    Du, Shengdong
    Li, Tianrui
    Yang, Yan
    Horng, Shi-Jinn
    NEUROCOMPUTING, 2020, 388 : 269 - 279
  • [49] DyGraphformer: Transformer combining dynamic spatio-temporal graph network for multivariate time series forecasting
    Han, Shuo
    Xun, Yaling
    Cai, Jianghui
    Yang, Haifeng
    Li, Yanfeng
    NEURAL NETWORKS, 2025, 181
  • [50] General Time Transformer: an Encoder-only Foundation Model for Zero-Shot Multivariate Time Series Forecasting
    Feng, Cheng
    Huang, Long
    Krompass, Denis
    PROCEEDINGS OF THE 33RD ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2024, 2024, : 3757 - 3761