A Dynamic pricing demand response algorithm for smart grid: Reinforcement learning approach

Cited by: 295
Authors
Lu, Renzhi [1 ]
Hong, Seung Ho [1 ]
Zhang, Xiongfeng [1 ]
Affiliations
[1] Hanyang Univ, Dept Elect Engn, Ansan 15588, South Korea
Funding
National Research Foundation of Singapore;
Keywords
Demand response; Dynamic pricing; Artificial intelligence; Reinforcement learning; Markov decision process; Q-learning; ENERGY MANAGEMENT SCHEME; ELECTRICITY DEMAND; ENVIRONMENT; MARKET; LOADS; MODEL; HOME;
DOI
10.1016/j.apenergy.2018.03.072
Chinese Library Classification
TE [Petroleum and natural gas industry]; TK [Energy and power engineering];
Discipline classification codes
0807; 0820;
Abstract
With the advanced information and communication technologies in modern smart grid systems, demand response (DR) has become an effective method for improving grid reliability and reducing energy costs, owing to its ability to react quickly to supply-demand mismatches by adjusting flexible loads on the demand side. This paper proposes a dynamic pricing DR algorithm for energy management in a hierarchical electricity market that considers both the service provider's (SP) profit and the customers' (CUs') costs. Reinforcement learning (RL) is used to model the hierarchical decision-making framework: the dynamic pricing problem is formulated as a discrete finite Markov decision process (MDP), and Q-learning is adopted to solve it. Using RL, the SP can adaptively set the retail electricity price during the on-line learning process, in which the uncertainty of CUs' load demand profiles and the fluctuation of wholesale electricity prices are addressed. Simulation results show that the proposed DR algorithm can increase SP profit, reduce energy costs for CUs, balance energy supply and demand in the electricity market, and improve the reliability of electric power systems; it can thus be regarded as a win-win strategy for both the SP and the CUs.
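The MDP/Q-learning scheme summarized above can be illustrated with a minimal tabular sketch. Everything concrete here is an assumption for demonstration only: the hourly states, the three candidate retail price levels, the flat wholesale price, the toy price-responsive demand model, and the reward weights are invented and are not the authors' actual formulation or parameters.

```python
import random

# Illustrative tabular Q-learning for dynamic retail pricing.
# All numbers below (states, prices, demand model, weights) are
# hypothetical placeholders, not the paper's parameters.

HOURS = 24                    # one state per hour of the day
PRICES = [0.10, 0.15, 0.20]   # assumed retail price levels ($/kWh)
WHOLESALE = 0.08              # assumed flat wholesale price ($/kWh)
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

def demand(hour, price):
    """Toy price-responsive load: demand shrinks as the price rises."""
    base = 10 + (5 if 8 <= hour <= 20 else 0)  # higher daytime load
    return max(0.0, base * (1 - 2 * (price - PRICES[0])))

def reward(hour, price):
    """Weighted sum of SP profit and (negative) CU cost, mirroring the
    paper's win-win objective; the weights here are assumed."""
    d = demand(hour, price)
    sp_profit = (price - WHOLESALE) * d
    cu_cost = price * d
    return 0.5 * sp_profit - 0.05 * cu_cost

def train(episodes=500, seed=0):
    """Epsilon-greedy Q-learning over the 24-hour pricing MDP."""
    rng = random.Random(seed)
    q = [[0.0] * len(PRICES) for _ in range(HOURS)]
    for _ in range(episodes):
        for h in range(HOURS):
            if rng.random() < EPS:
                a = rng.randrange(len(PRICES))          # explore
            else:
                a = max(range(len(PRICES)), key=lambda i: q[h][i])  # exploit
            r = reward(h, PRICES[a])
            nh = (h + 1) % HOURS                        # next hour
            q[h][a] += ALPHA * (r + GAMMA * max(q[nh]) - q[h][a])
    return q

if __name__ == "__main__":
    q = train()
    # Greedy pricing policy learned for each hour of the day.
    policy = [PRICES[max(range(len(PRICES)), key=lambda i: q[h][i])]
              for h in range(HOURS)]
    print(policy)
```

The key point the sketch shows is structural: the SP needs no explicit model of customer demand, because the learned Q-table absorbs the demand response to each posted price through the reward signal alone, which is how the paper's on-line learning handles demand uncertainty.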
Pages: 220-230 (11 pages)