An application of reinforcement learning to residential energy storage under real-time pricing

Cited by: 3
Authors
Brock, Eli [1 ]
Bruckstein, Lauren [2 ]
Connor, Patrick [1 ]
Nguyen, Sabrina [1 ]
Kerestes, Robert [1 ]
Abdelhakim, Mai [1 ]
Affiliations
[1] University of Pittsburgh, Department of Electrical & Computer Engineering, Pittsburgh, PA 15260, USA
[2] University of Pittsburgh, Department of Computer Science, Pittsburgh, PA, USA
Source
2021 IEEE PES Innovative Smart Grid Technologies - Asia (ISGT Asia) | 2021
Keywords
demand response; reinforcement learning; real-time pricing; energy storage; industrial customers; market
DOI
10.1109/ISGTAsia49270.2021.9715712
CLC Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Codes
0808; 0809
Abstract
With the proliferation of advanced metering infrastructure (AMI), more real-time data is available to electric utilities and consumers. Such high volumes of data facilitate innovative electricity rate structures beyond flat-rate and time-of-use (TOU) tariffs. One such innovation is real-time pricing (RTP), in which the wholesale market-clearing price is passed directly to the consumer on an hour-by-hour basis. While rare, RTP exists in parts of the United States and has been observed to reduce electric bills. Although these reductions are largely incidental, RTP may represent an opportunity for large-scale peak shaving, demand response, and economic efficiency when paired with intelligent control systems. Algorithms controlling flexible loads and energy storage have been deployed for demand response elsewhere in the literature, but few studies have investigated these algorithms in an RTP environment. If properly optimized, the dynamic between RTP and intelligent control has the potential to counteract the unwelcome spikes and dips of demand driven by growing penetration of distributed renewable generation and electric vehicles (EV). This paper presents a simple reinforcement learning (RL) application for optimal battery control subject to an RTP signal.
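To make the abstract's proposal concrete, the sketch below shows one plausible shape of such a controller: a tabular Q-learning agent whose state is a discretized (state of charge, price) pair and whose reward is the hourly cost of charging or the revenue from discharging at the RTP price. The discretization, battery size, reward definition, and synthetic price trace are all assumptions made for illustration; they are not the paper's actual environment or algorithm.

```python
# A minimal, self-contained sketch of the kind of controller described above:
# tabular Q-learning for hourly battery dispatch against a real-time price.
# All constants and the environment below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N_SOC = 11                  # discretized state-of-charge levels (0% .. 100%)
N_PRICE = 10                # discretized hourly price bins
ACTIONS = (-1, 0, +1)       # discharge one step, idle, charge one step
CAPACITY_KWH = 13.5         # assumed residential battery capacity
STEP_KWH = CAPACITY_KWH / (N_SOC - 1)

Q = np.zeros((N_SOC, N_PRICE, len(ACTIONS)))
alpha, gamma, eps = 0.1, 0.95, 0.1   # learning rate, discount, exploration rate

def price_bin(price, lo=10.0, hi=90.0):
    """Map an hourly RTP price ($/MWh) onto one of N_PRICE bins."""
    idx = (price - lo) / (hi - lo) * N_PRICE
    return int(np.clip(idx, 0, N_PRICE - 1))

def step(soc, action_idx, price):
    """Apply one charge/idle/discharge action; return (next_soc, reward)."""
    next_soc = int(np.clip(soc + ACTIONS[action_idx], 0, N_SOC - 1))
    delta_kwh = (next_soc - soc) * STEP_KWH
    # Charging costs money at the current RTP price; discharging earns it
    # (e.g., by offsetting household consumption billed at that same price).
    reward = -delta_kwh * price / 1000.0   # $/MWh -> $/kWh
    return next_soc, reward

# Synthetic hourly prices standing in for a year of RTP data.
hours = 24 * 365
prices = 45 + 25 * np.sin(2 * np.pi * np.arange(hours) / 24) + rng.normal(0, 5, hours)

soc = N_SOC // 2
for t in range(hours - 1):
    p, p_next = price_bin(prices[t]), price_bin(prices[t + 1])
    # Epsilon-greedy action selection.
    a = rng.integers(len(ACTIONS)) if rng.random() < eps else int(np.argmax(Q[soc, p]))
    next_soc, r = step(soc, a, prices[t])
    # Standard Q-learning update toward the one-step bootstrapped target.
    Q[soc, p, a] += alpha * (r + gamma * Q[next_soc, p_next].max() - Q[soc, p, a])
    soc = next_soc

# The learned greedy policy, argmax over Q[soc, price_bin], charges in
# low-price bins and discharges in high-price bins.
```

A tabular agent suffices here only because both state dimensions are coarse; a richer state (time of day, forecasted prices, household load) would call for function approximation instead.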
Pages: 5