WDFormer: Acceptable Field Growth Transformer with Windows for Long Sequence Time-Series Forecasting

Cited by: 0
Authors
Yang, Jinghong [1 ]
Yu, Qing [1 ]
Affiliations
[1] Tianjin Univ Technol, Tianjin Key Lab Intelligence Comp & Novel Softwar, Tianjin, Peoples R China
Source
PROCEEDINGS OF THE 2024 27TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN, CSCWD 2024 | 2024
Keywords
Transformer; Long Sequence Time-Series Forecasting; CNN;
DOI
10.1109/CSCWD61410.2024.10580223
CLC Number
TP39 [Computer Applications];
Subject Classification Code
081203 ; 0835 ;
Abstract
Time-series forecasting has a wide range of applications in areas such as financial market forecasting, weather forecasting, and sales forecasting. Recent studies have shown that Transformers have advantages in handling such problems, especially Long Sequence Time-Series Forecasting (LSTF). When dealing with long sequence inputs in time-series data, Transformers primarily focus on improving the attention mechanism; however, they do not address the underlying issue of sequence length during attention computation. Moreover, many Transformer models are only loosely coupled with convolutional neural networks (CNNs) and thus fail to fully exploit their local feature extraction capabilities. We therefore propose WDFormer for LSTF, which introduces a local-global windowing mechanism to handle long sequence time-series and replaces canonical convolutional layers with dilated causal convolutional layers, obtaining exponential acceptable (receptive) field growth at negligible computational cost to improve long sequence forecasting accuracy. Extensive experiments on four large datasets show that WDFormer outperforms most baseline models on the LSTF problem.
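The exponential receptive ("acceptable") field growth mentioned in the abstract can be illustrated with a minimal pure-Python sketch. This is not the authors' implementation; the function names and the kernel-size-2, dilation-doubling configuration are illustrative assumptions. With dilation 2^l at layer l and kernel size 2, a stack of L layers has a receptive field of exactly 2^L past steps, whereas L canonical (dilation-1) layers cover only L + 1 steps.

```python
# Illustrative sketch (not the paper's code) of a 1-D dilated causal
# convolution and the exponential receptive-field growth it enables.

def dilated_causal_conv(x, weights, dilation):
    """Causal 1-D convolution: the output at step t depends only on
    x[t], x[t - d], x[t - 2d], ...; positions before the start of the
    sequence are treated as zero (left padding preserves causality)."""
    out = []
    for t in range(len(x)):
        s = 0.0
        for i, w in enumerate(weights):
            idx = t - i * dilation  # look back i * dilation steps
            if idx >= 0:
                s += w * x[idx]
        out.append(s)
    return out

def receptive_field(num_layers, kernel_size=2):
    """Receptive field of a stack of dilated causal convolutions with
    dilation 2**l at layer l. For kernel_size=2 this equals 2**num_layers,
    i.e. exponential growth in depth."""
    rf = 1
    for l in range(num_layers):
        rf += (kernel_size - 1) * 2 ** l
    return rf
```

For example, with weights `[1, 1]` and dilation 4, each output sums the current step and the step four positions back, so an impulse input produces a second echo four steps later; four such layers already see 16 past steps (`receptive_field(4) == 16`), versus only 5 for four canonical kernel-2 layers.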
Pages: 459 - 464
Page count: 6
Related Papers
50 in total
  • [21] TIformer: A Transformer-Based Framework for Time-Series Forecasting with Missing Data
    Ding, Zuocheng
    Chen, Yufan
    Wang, Hanchen
    Wang, Xiaoyang
    Zhang, Wenjie
    Zhang, Ying
    DATABASES THEORY AND APPLICATIONS, ADC 2024, 2025, 15449 : 71 - 84
  • [22] Time-Series Power Forecasting for Wind and Solar Energy Based on the SL-Transformer
    Zhu, Jian
    Zhao, Zhiyuan
    Zheng, Xiaoran
    An, Zhao
    Guo, Qingwu
    Li, Zhikai
    Sun, Jianling
    Guo, Yuanjun
    ENERGIES, 2023, 16 (22)
  • [23] RGANFormer: Relativistic Generative Adversarial Transformer for Time-Series Signal Forecasting on Intelligent Vehicles
    Xing, Yang
    Kong, Xiangqi
    Tsourdos, Antonios
    2024 35TH IEEE INTELLIGENT VEHICLES SYMPOSIUM, IEEE IV 2024, 2024, : 3241 - 3247
  • [24] InSAR time-series deformation forecasting surrounding Salt Lake using deep transformer models
    Wang, Jing
    Li, Chao
    Li, Lu
    Huang, Zhihua
    Wang, Chao
    Zhang, Hong
    Zhang, Zhengjia
    SCIENCE OF THE TOTAL ENVIRONMENT, 2023, 858
  • [25] Robformer: A robust decomposition transformer for long-term time series forecasting
    Yu, Yang
    Ma, Ruizhe
    Ma, Zongmin
    PATTERN RECOGNITION, 2024, 153
  • [26] SageFormer: Series-Aware Framework for Long-Term Multivariate Time-Series Forecasting
    Zhang, Zhenwei
    Meng, Linghang
    Gu, Yuantao
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (10): 18435 - 18448
  • [27] Long-term forecasting using transformer based on multiple time series
    Lee, Jaeyong
    Kim, Hyun Jun
    Lim, Changwon
    KOREAN JOURNAL OF APPLIED STATISTICS, 2024, 37 (05) : 583 - 598
  • [28] ENHANCING NUCLEAR POWER PLANT OPERATIONAL FORECASTING WITH TRANSFORMER NEURAL NETWORKS: A TIME-SERIES DATA APPROACH
    Tuo, Yanjie
    Liu, Xiaojing
    PROCEEDINGS OF 2024 31ST INTERNATIONAL CONFERENCE ON NUCLEAR ENGINEERING, VOL 1, ICONE31 2024, 2024,
  • [29] FEDAF: frequency enhanced decomposed attention free transformer for long time series forecasting
    Yang X.
    Li H.
    Huang X.
    Feng X.
    Neural Computing and Applications, 2024, 36 (26) : 16271 - 16288
  • [30] TCCT: Tightly-coupled convolutional transformer on time series forecasting
    Shen, Li
    Wang, Yangzhu
    NEUROCOMPUTING, 2022, 480 : 131 - 145