WDFormer: Acceptable Field Growth Transformer with Windows for Long Sequence Time-Series Forecasting

Cited by: 0
Authors
Yang, Jinghong [1]
Yu, Qing [1]
Affiliations
[1] Tianjin University of Technology, Tianjin Key Laboratory of Intelligence Computing and Novel Software Technology, Tianjin, People's Republic of China
Source
PROCEEDINGS OF THE 2024 27TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN, CSCWD 2024 | 2024
Keywords
Transformer; Long Sequence Time-Series Forecasting; CNN
DOI
10.1109/CSCWD61410.2024.10580223
Chinese Library Classification (CLC)
TP39 [Computer Applications]
Subject Classification Codes
081203; 0835
Abstract
Time-series forecasting has a wide range of applications in areas such as financial markets, weather, and sales. Recent studies have shown that Transformers have advantages in handling such problems, especially Long Sequence Time-Series Forecasting (LSTF). When dealing with long sequence inputs, existing Transformers focus primarily on improving the attention mechanism, but they do not address the underlying issue of sequence length during attention computation. Moreover, many Transformer models are only loosely coupled with convolutional neural networks (CNNs) and thus fail to fully exploit their local feature extraction capabilities. We therefore propose WDFormer for LSTF, which introduces a local-global window mechanism to handle long sequence time-series and replaces canonical convolutional layers with dilated causal convolutional layers, obtaining exponential receptive-field growth at a negligible additional computational cost and improving long-sequence forecasting accuracy. Extensive experiments on four large datasets show that WDFormer outperforms most baseline models on the LSTF problem.
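As a rough illustration of the dilated-causal-convolution idea the abstract describes (the sketch below is not the authors' implementation; the PyTorch module name DilatedCausalConvStack and all hyperparameters are assumptions), doubling the dilation at each of L layers with kernel size k yields a receptive field of r = 1 + (k - 1)(2^L - 1) time steps, e.g. k = 3 and L = 4 give r = 31, while the per-layer cost stays essentially constant:

import torch
import torch.nn as nn
import torch.nn.functional as F

class DilatedCausalConvStack(nn.Module):
    """Hypothetical sketch, not the paper's code: a stack of 1-D
    convolutions whose dilation doubles per layer, so the receptive
    field grows exponentially with depth."""
    def __init__(self, channels: int, kernel_size: int = 3, num_layers: int = 4):
        super().__init__()
        self.convs = nn.ModuleList()
        self.left_pads = []
        for layer in range(num_layers):
            dilation = 2 ** layer  # 1, 2, 4, 8, ...
            # Left-padding by (k - 1) * d keeps the convolution causal:
            # each output step sees only current and past inputs.
            self.left_pads.append((kernel_size - 1) * dilation)
            self.convs.append(nn.Conv1d(channels, channels, kernel_size, dilation=dilation))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time); the time length is preserved.
        for pad, conv in zip(self.left_pads, self.convs):
            x = torch.relu(conv(F.pad(x, (pad, 0))))
        return x

# Receptive field: 1 + (3 - 1) * (2**4 - 1) = 31 steps per output.
stack = DilatedCausalConvStack(channels=8)
y = stack(torch.randn(2, 8, 96))  # e.g., a length-96 input window
print(y.shape)                    # torch.Size([2, 8, 96])

For comparison, a canonical (dilation-1) stack of the same depth and kernel size covers only 1 + (k - 1)L = 9 steps, which is the gap in receptive-field growth that motivates the substitution.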
Pages: 459-464
Number of pages: 6