Model-Free Real-Time Autonomous Control for a Residential Multi-Energy System Using Deep Reinforcement Learning

Cited by: 156
Authors
Ye, Yujian [1 ,2 ]
Qiu, Dawei [1 ]
Wu, Xiaodong [3 ]
Strbac, Goran [1 ]
Ward, Jonathan [4 ]
Affiliations
[1] Imperial Coll London, Dept Elect & Elect Engn, London SW7 2AZ, England
[2] Fetch Ai, Cambridge CB4 0WS, England
[3] Univ Cambridge, Dept Engn, Cambridge CB2 1PZ, England
[4] Fetch Ai, Res, Cambridge, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
Energy management; Uncertainty; Resistance heating; Smart grids; Real-time systems; Schedules; Deep neural network; deep reinforcement learning; energy management system; multi-energy system; smart grid; ENERGY-STORAGE SYSTEMS; DEMAND RESPONSE; STOCHASTIC OPTIMIZATION; OPTIMAL OPERATION; HUB; MANAGEMENT;
DOI
10.1109/TSG.2020.2976771
Chinese Library Classification (CLC)
TM (Electrical Technology); TN (Electronic Technology, Communication Technology)
Discipline Classification Codes
0808; 0809
Abstract
Multi-energy systems (MES) are attracting increasing attention, driven by their potential to offer significant flexibility in future smart grids. At the residential level, the roll-out of smart meters and the rapid deployment of smart energy devices call for autonomous multi-energy management systems that can exploit real-time information to optimally schedule the usage of different devices, with the aim of minimizing end-users' energy costs. This paper proposes a novel real-time autonomous energy management strategy for a residential MES based on a model-free deep reinforcement learning (DRL) approach, combining the state-of-the-art deep deterministic policy gradient (DDPG) method with an innovative prioritized experience replay strategy. The approach is tailored to the nature of the problem by formulating it over multi-dimensional continuous state and action spaces, enabling more cost-effective control strategies to be devised. The superior performance of the proposed approach in reducing end-users' energy costs while coping with MES uncertainties is demonstrated by comparing it against state-of-the-art DRL methods, as well as conventional stochastic programming and robust optimization methods, in numerous case studies based on a real-world scenario.
Pages: 3068-3082
Page count: 15
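
To make the abstract's method more concrete, the sketch below illustrates the prioritized experience replay component that the paper combines with DDPG. It is a minimal sketch of the standard proportional-priority scheme of Schaul et al., used here as a stand-in for the paper's own replay strategy, and all names, dimensions, and hyper-parameters (PrioritizedReplayBuffer, state_dim, action_dim, alpha, beta) are illustrative assumptions, not taken from the paper.

import numpy as np


class PrioritizedReplayBuffer:
    """Proportional prioritized experience replay (after Schaul et al., 2016).

    Transitions with larger temporal-difference (TD) error are replayed more
    often, which is the mechanism layered on top of DDPG in this setting.
    """

    def __init__(self, capacity, alpha=0.6):
        self.capacity = capacity              # maximum number of stored transitions
        self.alpha = alpha                    # how strongly priorities bias sampling
        self.buffer = []                      # stored (s, a, r, s_next, done) tuples
        self.priorities = np.zeros(capacity)  # one priority per slot
        self.pos = 0                          # next write index (ring buffer)

    def add(self, transition):
        # New transitions get the current maximum priority so they are replayed
        # at least once before their TD error is known.
        max_prio = self.priorities.max() if self.buffer else 1.0
        if len(self.buffer) < self.capacity:
            self.buffer.append(transition)
        else:
            self.buffer[self.pos] = transition
        self.priorities[self.pos] = max_prio
        self.pos = (self.pos + 1) % self.capacity

    def sample(self, batch_size, beta=0.4):
        # Sampling probability is proportional to priority**alpha.
        prios = self.priorities[:len(self.buffer)]
        probs = prios ** self.alpha
        probs /= probs.sum()
        idx = np.random.choice(len(self.buffer), batch_size, p=probs)
        # Importance-sampling weights correct the bias of non-uniform sampling;
        # beta is typically annealed towards 1 over training.
        weights = (len(self.buffer) * probs[idx]) ** (-beta)
        weights /= weights.max()
        batch = [self.buffer[i] for i in idx]
        return batch, idx, weights

    def update_priorities(self, idx, td_errors, eps=1e-6):
        # Priorities are refreshed with |TD error| after each critic update;
        # eps keeps every transition's sampling probability strictly positive.
        self.priorities[idx] = np.abs(td_errors) + eps


if __name__ == "__main__":
    # Illustrative MES dimensions (assumed): the state could hold prices,
    # temperatures, and storage levels; the continuous action could hold
    # device power set-points, matching the continuous-control formulation.
    state_dim, action_dim = 8, 3
    buf = PrioritizedReplayBuffer(capacity=10_000)
    for _ in range(200):
        s = np.random.randn(state_dim)
        a = np.random.uniform(-1.0, 1.0, size=action_dim)  # continuous action
        r = -float(np.random.rand())                        # e.g. negative energy cost
        s_next = np.random.randn(state_dim)
        buf.add((s, a, r, s_next, False))
    batch, idx, weights = buf.sample(batch_size=32)
    buf.update_priorities(idx, td_errors=np.random.randn(32))

In a full DDPG agent, the returned weights would scale the critic's per-sample loss and the TD errors from the critic update would feed update_priorities; those pieces are omitted here for brevity.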