Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting

Authors
Wu, Haixu [1 ]
Xu, Jiehui [1 ]
Wang, Jianmin [1 ]
Long, Mingsheng [1 ]
Affiliations
[1] Tsinghua Univ, BNRist, Sch Software, Beijing, Peoples R China
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021) | 2021, Vol. 34
Funding
National Natural Science Foundation of China
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Extending the forecasting horizon is a critical demand in real applications, such as extreme weather early warning and long-term energy consumption planning. This paper studies the long-term forecasting problem of time series. Prior Transformer-based models adopt various self-attention mechanisms to discover long-range dependencies. However, the intricate temporal patterns of the long-term future prevent the model from finding reliable dependencies. Moreover, Transformers have to adopt sparse versions of point-wise self-attention for efficiency on long series, resulting in an information-utilization bottleneck. Going beyond Transformers, we design Autoformer as a novel decomposition architecture with an Auto-Correlation mechanism. We break with the pre-processing convention of series decomposition and renovate it as a basic inner block of deep models. This design empowers Autoformer with progressive decomposition capacities for complex time series. Further, inspired by stochastic process theory, we design the Auto-Correlation mechanism based on series periodicity, which conducts dependency discovery and representation aggregation at the sub-series level. Auto-Correlation outperforms self-attention in both efficiency and accuracy. In long-term forecasting, Autoformer yields state-of-the-art accuracy, with a 38% relative improvement on six benchmarks covering five practical applications: energy, traffic, economics, weather, and disease. Code is available at this repository: https://github.com/thuml/Autoformer.
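The two core ideas in the abstract can be made concrete with a short sketch: (a) series decomposition as an inner block, where a moving average extracts the trend and the residual is the seasonal part, and (b) period-based Auto-Correlation, where autocorrelation is computed in the frequency domain (Wiener-Khinchin theorem), the top-k lags are taken as candidate periods, and time-delayed sub-series are aggregated with softmax weights. The following is a minimal, illustrative PyTorch sketch; the function names, the kernel_size and top_k parameters, and all implementation details are assumptions for illustration, not the exact code from the authors' repository.

```python
# Illustrative sketch of the abstract's two mechanisms; not the authors' code.
import torch
import torch.nn.functional as F


def series_decomp(x: torch.Tensor, kernel_size: int = 25):
    """Split a series into seasonal and trend parts via a moving average.

    x: (batch, length, channels). Returns (seasonal, trend), both like x.
    """
    # Pad both ends by repeating the boundary values so the moving
    # average keeps the original sequence length.
    front = x[:, :1, :].repeat(1, (kernel_size - 1) // 2, 1)
    back = x[:, -1:, :].repeat(1, kernel_size // 2, 1)
    padded = torch.cat([front, x, back], dim=1)
    # Sliding-window average over time gives the trend component.
    trend = F.avg_pool1d(padded.transpose(1, 2), kernel_size, stride=1)
    trend = trend.transpose(1, 2)
    seasonal = x - trend
    return seasonal, trend


def auto_correlation(q: torch.Tensor, v: torch.Tensor, top_k: int = 4):
    """Period-based dependency discovery and sub-series aggregation.

    q, v: (batch, length, channels). Autocorrelation is computed via FFT,
    the top-k lags are treated as the most likely periods, and rolled
    (time-delayed) copies of v are mixed with softmax-normalized scores.
    """
    length = q.size(1)
    # Wiener-Khinchin: autocorrelation = inverse FFT of the cross spectrum.
    q_fft = torch.fft.rfft(q, dim=1)
    v_fft = torch.fft.rfft(v, dim=1)
    corr = torch.fft.irfft(q_fft * torch.conj(v_fft), n=length, dim=1)
    # Average over batch and channels to select lags shared by the series.
    mean_corr = corr.mean(dim=(0, 2))              # (length,)
    weights, lags = torch.topk(mean_corr, top_k)   # strongest candidate periods
    weights = torch.softmax(weights, dim=0)
    # Aggregate the series rolled by each selected lag (sub-series level).
    out = torch.zeros_like(v)
    for w, lag in zip(weights, lags):
        out = out + w * torch.roll(v, shifts=-int(lag), dims=1)
    return out


if __name__ == "__main__":
    x = torch.randn(2, 96, 7)              # 2 series, length 96, 7 variables
    seasonal, trend = series_decomp(x)
    agg = auto_correlation(seasonal, seasonal)
    print(seasonal.shape, trend.shape, agg.shape)  # all torch.Size([2, 96, 7])
```

Note the contrast with self-attention: the sketch compares whole sub-series (rolled copies of the sequence) rather than individual time points, which is what the abstract means by dependency discovery and aggregation at the sub-series level.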
Pages: 12
Related Papers
50 records in total
  • [1] CLformer: Locally grouped auto-correlation and convolutional transformer for long-term multivariate time series forecasting
    Wang, Xingyu
    Liu, Hui
    Du, Junzhao
    Yang, Zhihan
    Dong, Xiyao
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 121
  • [2] AD-autoformer: decomposition transformers with attention distilling for long sequence time-series forecasting
    Cao, Danyang
    Zhang, Shuai
    JOURNAL OF SUPERCOMPUTING, 2024, 80 (14): 21128-21148
  • [3] InParformer: Evolutionary Decomposition Transformers with Interactive Parallel Attention for Long-Term Time Series Forecasting
    Cao, Haizhou
    Huang, Zhenhao
    Yao, Tiechui
    Wang, Jue
    He, Hui
    Wang, Yangang
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 6, 2023: 6906-6915
  • [4] A Long-Term Traffic Flow Prediction Model Based on Variational Mode Decomposition and Auto-Correlation Mechanism
    Guo, Kaixin
    Yu, Xin
    Liu, Gaoxiang
    Tang, Shaohu
    APPLIED SCIENCES-BASEL, 2023, 13 (12)
  • [5] ESSformer: Transformers with ESS Attention for Long-Term Series Forecasting
    Wu, Siyu
    Xiong, Kai
    Yu, Feiyang
    Pan, Xiyu
    Li, Jianjun
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING-ICANN 2024, PT VI, 2024, 15021: 221-234
  • [8] A Long-term Time Series Forecasting method with Multiple Decomposition
    Wang, Yang
    Chen, Xu
    Wang, Shuyang
    Jing, Yongjun
    35TH INTERNATIONAL CONFERENCE ON SCIENTIFIC AND STATISTICAL DATABASE MANAGEMENT, SSDBM 2023
  • [9] Robformer: A robust decomposition transformer for long-term time series forecasting
    Yu, Yang
    Ma, Ruizhe
    Ma, Zongmin
    PATTERN RECOGNITION, 2024, 153
  • [10] Why the long-term auto-correlation has not been eliminated by arbitragers: Evidences from NYMEX
    Li, Daye
    Nishimura, Yusaku
    Men, Ming
    ENERGY ECONOMICS, 2016, 59: 167-178