An efficient deep model for day-ahead electricity load forecasting with stacked denoising auto-encoders

Cited by: 84
Authors
Tong, Chao [1 ]
Li, Jun [1 ]
Lang, Chao [1 ]
Kong, Fanxin [2 ]
Niu, Jianwei [1 ]
Rodrigues, Joel J. P. C. [3 ,4 ,5 ,6 ]
Affiliations
[1] Beihang Univ, Sch Comp Sci & Engn, Beijing 100191, Peoples R China
[2] McGill Univ, Sch Comp Sci, Montreal, PQ H3A 2T5, Canada
[3] Natl Inst Telecommun Inatel, BR-37540000 Santa Rita Do Sapucai, MG, Brazil
[4] Inst Telecomunicacoes, P-6201001 Covilha, Portugal
[5] Univ Fortaleza UNIFOR, BR-60811905 Fortaleza, Ceara, Brazil
[6] ITMO Univ, St Petersburg 191002, Russia
Funding
National Natural Science Foundation of China;
Keywords
Deep learning; Multi-modal; Stacked denoising auto-encoders; Feature extraction; Support vector regression; NETWORK;
DOI
10.1016/j.jpdc.2017.06.007
Chinese Library Classification (CLC)
TP301 [Theory, Methods];
Discipline code
081202;
Abstract
In the real world it is quite meaningful to forecast the day-ahead electricity load for an area, since doing so helps reduce electricity waste and rationally schedule electric generator units. The deployment of various sensors has pushed this forecasting research into a "big data" era, as a huge amount of information has been accumulated. Meanwhile, the rapid development of deep learning (DL) theory provides powerful tools for handling massive data that often outperform conventional machine learning methods in many traditional fields. Inspired by these trends, we propose a deep-learning-based model that first refines features from historical electricity load data and related temperature parameters with stacked denoising auto-encoders (SDAs), and subsequently trains a support vector regression (SVR) model to forecast the day-ahead total electricity load. The most significant contribution of this heterogeneous deep model is that the abstract features extracted by SDAs from the original electricity load data are shown to describe and forecast the load tendency more accurately, with lower errors. We evaluate the proposed model by comparing it with plain SVR and artificial neural network (ANN) models, and the experimental results validate its performance improvements. (C) 2017 Elsevier Inc. All rights reserved.
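The two-stage pipeline summarized in the abstract (greedy layer-wise SDA feature extraction, then SVR on the learned features) could be sketched roughly as below. This is a hypothetical illustration, not the authors' implementation: the toy data, layer sizes, noise level, and training loop are all illustrative assumptions.

```python
# Hypothetical sketch of the paper's pipeline:
# 1) stack denoising auto-encoders to refine features from load/temperature inputs,
# 2) train an SVR on the extracted features to forecast day-ahead total load.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class DenoisingAutoencoder:
    """One auto-encoder layer with tied weights, trained to reconstruct
    the clean input from a randomly masked (corrupted) copy."""
    def __init__(self, n_in, n_hidden, lr=0.1, noise=0.2, epochs=200):
        self.W = rng.normal(0, 0.1, (n_in, n_hidden))
        self.b = np.zeros(n_hidden)   # encoder bias
        self.c = np.zeros(n_in)       # decoder bias
        self.lr, self.noise, self.epochs = lr, noise, epochs

    def fit(self, X):
        for _ in range(self.epochs):
            # corrupt inputs with masking noise, then reconstruct the clean X
            X_noisy = X * (rng.random(X.shape) > self.noise)
            H = sigmoid(X_noisy @ self.W + self.b)
            X_hat = sigmoid(H @ self.W.T + self.c)
            err = X_hat - X                       # reconstruction error
            # gradients of the squared reconstruction loss (tied weights)
            d_out = err * X_hat * (1 - X_hat)
            d_hid = (d_out @ self.W) * H * (1 - H)
            gW = X_noisy.T @ d_hid + d_out.T @ H
            self.W -= self.lr * gW / len(X)
            self.b -= self.lr * d_hid.mean(axis=0)
            self.c -= self.lr * d_out.mean(axis=0)
        return self

    def transform(self, X):
        return sigmoid(X @ self.W + self.b)

# Toy stand-in data: 24 hourly loads + 24 hourly temperatures per day;
# the target is the (synthetic) next-day total load.
X = rng.random((200, 48))
y = X[:, :24].sum(axis=1) + 0.1 * rng.standard_normal(200)

# Stack two denoising auto-encoders, trained greedily layer by layer.
dae1 = DenoisingAutoencoder(48, 24).fit(X)
H1 = dae1.transform(X)
dae2 = DenoisingAutoencoder(24, 12).fit(H1)
features = dae2.transform(H1)

# SVR on the abstract features forecasts the day-ahead total load.
model = SVR(kernel="rbf", C=10.0).fit(features, y)
print("train R^2:", round(model.score(features, y), 3))
```

Greedy layer-wise pre-training means each auto-encoder is fit on the previous layer's output before the next is trained, which is the standard SDA recipe; the paper's actual layer widths, corruption levels, and SVR hyper-parameters are not given in this record.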
Pages: 267-273 (7 pages)
Related papers
50 records
  • [1] In Day-Ahead Electricity Load Forecasting
    Klempka, Ryszard
    Swiatek, Boguslaw
    2009 10TH INTERNATIONAL CONFERENCE ON ELECTRICAL POWER QUALITY AND UTILISATION (EPQU 2009), 2009, : 313 - 317
  • [2] Complete Stacked Denoising Auto-Encoders for Regression
    Fernandez-Garcia, Maria-Elena
    Sancho-Gomez, Jose-Luis
    Ros-Ros, Antonio
    Figueiras-Vidal, Anibal R.
    NEURAL PROCESSING LETTERS, 2021, 53 (01) : 787 - 797
  • [4] Stacked Denoising Auto-Encoders for Short-Term Time Series Forecasting
    Romeu, Pablo
    Zamora-Martinez, Francisco
    Botella-Rocamora, Paloma
    Pardo, Juan
    ARTIFICIAL NEURAL NETWORKS, 2015, : 463 - 486
  • [5] Forecasting quantiles of day-ahead electricity load
    Li, Z.
    Hurn, A. S.
    Clements, A. E.
    ENERGY ECONOMICS, 2017, 67 : 60 - 71
  • [6] Extracting and inserting knowledge into stacked denoising auto-encoders
    Yu, Jianbo
    Liu, Guoliang
    NEURAL NETWORKS, 2021, 137 : 31 - 42
  • [7] Stacked Convolutional Denoising Auto-Encoders for Feature Representation
    Du, Bo
    Xiong, Wei
    Wu, Jia
    Zhang, Lefei
    Zhang, Liangpei
    Tao, Dacheng
    IEEE TRANSACTIONS ON CYBERNETICS, 2017, 47 (04) : 1017 - 1027
  • [8] Deep learning for day-ahead electricity price forecasting
    Zhang, Chi
    Li, Ran
    Shi, Heng
    Li, Furong
    IET SMART GRID, 2020, 3 (04) : 462 - 469
  • [9] Electric Load Data Compression and Classification Based on Deep Stacked Auto-Encoders
    Huang, Xiaoyao
    Hu, Tianbin
    Ye, Chengjin
    Xu, Guanhua
    Wang, Xiaojian
    Chen, Liangjin
    ENERGIES, 2019, 12 (04)
  • [10] The Day-Ahead Electricity Price Forecasting Based on Stacked CNN and LSTM
    Xie, Xiaolong
    Xu, Wei
    Tan, Hongzhi
    INTELLIGENCE SCIENCE AND BIG DATA ENGINEERING, 2018, 11266 : 216 - 230