WDFormer: Acceptable Field Growth Transformer with Windows for Long Sequence Time-Series Forecasting

Cited by: 0
Authors
Yang, Jinghong [1 ]
Yu, Qing [1 ]
Affiliations
[1] Tianjin Univ Technol, Tianjin Key Lab Intelligence Comp & Novel Software, Tianjin, Peoples R China
Source
PROCEEDINGS OF THE 2024 27TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN, CSCWD 2024 | 2024
Keywords
Transformer; Long Sequence Time-Series Forecasting; CNN
DOI
10.1109/CSCWD61410.2024.10580223
CLC Classification Number
TP39 [Computer Applications]
Discipline Classification Codes
081203; 0835
Abstract
Time-series forecasting has a wide range of applications in areas such as financial market forecasting, weather forecasting, and sales forecasting. Recent studies have shown that Transformers are well suited to such problems, especially Long Sequence Time-Series Forecasting (LSTF). When handling long sequence inputs, however, existing Transformers focus mainly on improving the attention mechanism and do not address the underlying issue of sequence length in the attention computation. In addition, many Transformer models are only loosely coupled with convolutional neural networks (CNNs) and fail to fully exploit their local feature extraction capabilities. We therefore propose WDFormer for Long Sequence Time-Series Forecasting (LSTF), which introduces a local-global window mechanism to handle long sequence time-series and replaces canonical convolutional layers with dilated causal convolutional layers, yielding exponential receptive field growth at negligible computational cost and improving long sequence forecasting accuracy. Extensive experiments on four large datasets show that WDFormer outperforms most baseline models on the LSTF problem.
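For readers unfamiliar with the dilated causal convolutions mentioned in the abstract, the sketch below is a minimal illustration in plain PyTorch, not the authors' WDFormer code; the class and function names are invented for the example. It shows why stacking layers with dilations 1, 2, 4, ... gives a receptive field that grows exponentially with depth while the per-layer cost stays roughly constant.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DilatedCausalConv1d(nn.Module):
    """Causal 1-D convolution: left-pad so each output sees only past inputs."""

    def __init__(self, channels: int, kernel_size: int, dilation: int):
        super().__init__()
        self.left_pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, channels, time); pad on the left only,
        # so the output length equals the input length.
        return self.conv(F.pad(x, (self.left_pad, 0)))


def receptive_field(kernel_size: int, num_layers: int) -> int:
    """Receptive field of a stack with dilations 1, 2, 4, ..., 2**(L-1):
    1 + (k - 1) * (2**L - 1), i.e. exponential in the depth L."""
    return 1 + (kernel_size - 1) * (2 ** num_layers - 1)


if __name__ == "__main__":
    stack = nn.Sequential(
        *[DilatedCausalConv1d(channels=8, kernel_size=3, dilation=2 ** i) for i in range(4)]
    )
    x = torch.randn(2, 8, 96)      # (batch, channels, input length)
    print(stack(x).shape)          # torch.Size([2, 8, 96]) -- length preserved
    print(receptive_field(3, 4))   # 31 time steps covered by a 4-layer stack
```

With the same four layers and undilated (canonical) convolutions, the receptive field would be only 1 + 4 * (3 - 1) = 9 time steps, which is why the paper swaps canonical convolutions for dilated causal ones when covering long input sequences.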
Pages: 459-464
Number of pages: 6