Regression model-based hourly aggregated electricity demand prediction

Cited by: 3
Authors
Panigrahi, Radharani [1 ]
Patne, Nita R. [1 ]
Pemmada, Sumanth [1 ]
Manchalwar, Ashwini D. [1 ]
Affiliations
[1] Visvesvaraya National Institute of Technology, Nagpur 440010, India
Keywords
CatBoost; Electricity demand prediction; Gradient boosting; Machine learning; Overfitting;
DOI
10.1016/j.egyr.2022.10.004
Chinese Library Classification (CLC)
TE [Petroleum and natural gas industry]; TK [Energy and power engineering];
Discipline codes
0807; 0820
Abstract
The ability to predict the aggregated electricity demand of an electrical grid on an hourly basis is crucial for energy and demand management. In this study, three years of demand data and its categorical features are segregated into four seasons and then fed to an efficient Machine Learning Categorical Boosting (ML CatBoost) Regressor model to predict the next year's demand. The model uses a gradient boosting algorithm that handles categorical features adeptly, together with a scheme for estimating leaf values while choosing the tree structure that reduces overfitting. Hourly electricity demand data from the Electric Reliability Council of Texas (ERCOT, western region) is used as the benchmark data to evaluate the CatBoost model. In addition, five other ML models were developed, analyzed, and tested on the same ERCOT data for predicting the hourly aggregated electricity demand. The suggested model is compared with a long short-term memory neural network (LSTM-NN) and the five other ML models in terms of performance evaluation metrics. One additional performance metric, the Coefficient of Variation of the Root Mean Squared Error (CV-RMSE), is evaluated alongside the benchmark paper's parameters. Finally, the importance of accurate prediction in the electric grid for clean energy is discussed. (c) 2022 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
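The CV-RMSE metric named in the abstract normalizes the root mean squared error by the mean of the observed series, which makes forecast accuracy comparable across demand series of different magnitudes. A minimal sketch of this computation (toy demand values chosen here for illustration, not from the paper's ERCOT data):

```python
import math

def cv_rmse(actual, predicted):
    """Coefficient of Variation of the RMSE, in percent:
    100 * RMSE / mean(actual)."""
    n = len(actual)
    # Root mean squared error of the hourly predictions
    rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
    # Normalize by the mean observed demand and express as a percentage
    return 100.0 * rmse / (sum(actual) / n)

# Toy hourly demand (MW) and model predictions
actual = [100.0, 110.0, 120.0, 130.0]
pred = [102.0, 108.0, 121.0, 128.0]
print(round(cv_rmse(actual, pred), 2))  # -> 1.57
```

A lower CV-RMSE indicates a tighter fit relative to average demand; ASHRAE Guideline 14 (reference [10]) uses this normalized form when judging energy-model calibration.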
Pages: 16-24 (9 pages)
References
26 in total
[1]   Boosting as a kernel-based method [J].
Aravkin, Aleksandr Y. ;
Bottegal, Giulio ;
Pillonetto, Gianluigi .
MACHINE LEARNING, 2019, 108 (11) :1951-1974
[2]   A gradient boosting approach to the Kaggle load forecasting competition [J].
Ben Taieb, Souhaib ;
Hyndman, Rob J. .
INTERNATIONAL JOURNAL OF FORECASTING, 2014, 30 (02) :382-394
[3]  
Cestnik B., 1990, ECAI 90. Proceedings of the 9th European Conference on Artificial Intelligence, P147
[4]   An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization [J].
Dietterich, TG .
MACHINE LEARNING, 2000, 40 (02) :139-157
[5]  
ERCOT, US
[6]   Cooling load prediction and optimal operation of HVAC systems using a multiple nonlinear regression model [J].
Fan, Chengliang ;
Ding, Yunfei .
ENERGY AND BUILDINGS, 2019, 197 :7-17
[7]   A decision-theoretic generalization of on-line learning and an application to boosting [J].
Freund, Y ;
Schapire, RE .
JOURNAL OF COMPUTER AND SYSTEM SCIENCES, 1997, 55 (01) :119-139
[8]   Greedy function approximation: A gradient boosting machine [J].
Friedman, JH .
ANNALS OF STATISTICS, 2001, 29 (05) :1189-1232
[9]   Stochastic gradient boosting [J].
Friedman, JH .
COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2002, 38 (04) :367-378
[10]
ASHRAE, 2002, Guideline 14-2002, Measurement of Energy and Demand Savings