Informer model with season-aware block for efficient long-term power time series forecasting

Cited by: 14
Authors
Cui, Yunlong [1 ]
Li, Zhao [1 ]
Wang, Yusong [1 ]
Dong, Danhuang [2 ]
Gu, Chenlin [2 ]
Lou, Xiaowei [1 ]
Zhang, Peng [1 ]
Affiliations
[1] Zhejiang Univ, Hangzhou, Peoples R China
[2] State Grid Zhejiang Elect Power Co Ltd, Econ & Technol Res Inst, Hangzhou, Peoples R China
Keywords
LSTF; Self-attention; Data mining; Temporal covariates;
DOI
10.1016/j.compeleceng.2024.109492
Chinese Library Classification: TP3 (Computing technology, computer technology)
Subject classification code: 0812
Abstract
With the development of electricity spot markets, accurate electricity load forecasting enables power generation companies to supply the right amount of electricity, greatly reducing power waste. Time series forecasting in the power domain can therefore bring great benefits. Previously, the Informer model successfully introduced the Transformer into long sequence time-series forecasting (LSTF) by proposing the ProbSparse self-attention mechanism, which addressed the inherently high memory complexity of self-attention. Recent research has further demonstrated the potential of self-attention for mining complex dependencies. However, the limited amount of historical data remains one of the main challenges in applying deep learning techniques to power LSTF tasks. Previous studies often add a large number of temporal covariates to provide more information. In this paper, to address this issue, (i) we design a simple but effective Season-aware Block that enhances the model's ability to mine the artificial prior information carried by temporal covariates; (ii) we conduct experiments on provincial power data from Zhejiang Province, China, from 2019 to 2022, where our model outperforms other models with a 19% relative improvement in MSE, which can help power generation companies reduce costs and further lower prices for consumers; (iii) we conduct ablation experiments to assess the efficacy of the Season-aware Block in extracting temporal periodic features. Furthermore, we elucidate, through visualization experiments, the underlying reasons for the effectiveness of both the self-attention mechanism and the Season-aware Block.
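The record does not reproduce the paper's Season-aware Block itself, but the abstract's idea of exposing periodic prior information through temporal covariates is commonly implemented by encoding calendar positions as sine/cosine pairs before feeding them to the forecaster. The sketch below is an illustrative assumption, not the authors' implementation: the function name `seasonal_covariates` and the choice of daily/weekly/yearly cycles are hypothetical.

```python
import numpy as np
import pandas as pd

def seasonal_covariates(index: pd.DatetimeIndex) -> np.ndarray:
    """Encode calendar position as sin/cos pairs so a model sees the
    daily, weekly, and yearly periodicity of load data directly.
    (Hypothetical helper; the paper's Season-aware Block is not shown
    in this record.)"""
    feats = []
    for value, period in [
        (index.hour, 24),           # daily cycle
        (index.dayofweek, 7),       # weekly cycle
        (index.dayofyear, 365.25),  # yearly cycle
    ]:
        angle = 2.0 * np.pi * np.asarray(value, dtype=float) / period
        feats.append(np.sin(angle))
        feats.append(np.cos(angle))
    # One row per timestamp, two features (sin, cos) per cycle.
    return np.stack(feats, axis=-1)

# One year of hourly timestamps, matching the hourly granularity
# typical of provincial load data (an assumption here).
idx = pd.date_range("2019-01-01", periods=8760, freq="h")
cov = seasonal_covariates(idx)
print(cov.shape)  # (8760, 6)
```

Covariates of this form are deterministic functions of the timestamp, so they can be computed for future horizons as well, which is what makes them usable as prior information in long-horizon forecasting.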
Pages: 11