WDFormer: Acceptable Field Growth Transformer with Windows for Long Sequence Time-Series Forecasting

Cited by: 0
Authors
Yang, Jinghong [1 ]
Yu, Qing [1 ]
Affiliations
[1] Tianjin Univ Technol, Tianjin Key Lab Intelligence Comp & Novel Softwar, Tianjin, Peoples R China
Source
PROCEEDINGS OF THE 2024 27TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN, CSCWD 2024 | 2024
Keywords
Transformer; Long Sequence Time-Series Forecasting; CNN;
DOI
10.1109/CSCWD61410.2024.10580223
CLC Classification Number
TP39 [Computer Applications];
Discipline Classification Code
081203 ; 0835 ;
Abstract
Time-series forecasting has a wide range of applications in areas such as financial market forecasting, weather forecasting, and sales forecasting. Recent studies have shown that Transformers are well suited to such problems, especially Long Sequence Time-Series Forecasting (LSTF). When handling long input sequences, existing Transformer variants focus mainly on improving the attention mechanism, but they do not address the underlying issue of sequence length during attention computation. Moreover, many Transformer models are only loosely coupled with convolutional neural networks (CNNs) and therefore fail to fully exploit their local feature extraction capabilities. We propose WDFormer for LSTF, which introduces a local-global windowing mechanism to handle long sequence time series and replaces canonical convolutional layers with dilated causal convolutional layers, obtaining exponential receptive field growth at a nearly negligible computational cost and improving long-sequence forecasting accuracy. Extensive experiments on four large datasets show that WDFormer outperforms most baseline models on the LSTF problem.
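The dilated causal convolution mentioned in the abstract is the standard mechanism behind exponential receptive field growth: stacking layers whose dilation doubles at each level widens the receptive field geometrically while each layer costs the same as an ordinary convolution. Below is a minimal PyTorch sketch of such a stack; it is an illustration only, not the authors' implementation, and the module name, kernel size, depth, and channel counts are assumptions.

import torch
import torch.nn as nn

class DilatedCausalConvStack(nn.Module):
    # Hypothetical illustration, not the WDFormer code: a stack of 1-D causal
    # convolutions whose dilation doubles at every layer, so the receptive
    # field grows exponentially with depth at constant per-layer cost.
    def __init__(self, channels=16, kernel_size=3, num_layers=4):
        super().__init__()
        layers = []
        for i in range(num_layers):
            dilation = 2 ** i
            # Pad only on the left so output[t] depends only on inputs up to t (causal).
            layers.append(nn.ConstantPad1d(((kernel_size - 1) * dilation, 0), 0.0))
            layers.append(nn.Conv1d(channels, channels, kernel_size, dilation=dilation))
            layers.append(nn.ReLU())
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        # x: (batch, channels, sequence_length); the output keeps the same length.
        return self.net(x)

x = torch.randn(8, 16, 96)              # e.g. a 96-step input window
y = DilatedCausalConvStack(16)(x)
print(y.shape)                          # torch.Size([8, 16, 96])

With kernel size 3 and dilations 1, 2, 4, 8, the receptive field is 1 + 2 * (1 + 2 + 4 + 8) = 31 time steps; one extra layer pushes it to 63, whereas a stack of canonical (dilation-1) convolutions grows its receptive field only linearly with depth.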
Pages: 459 - 464
Number of pages: 6