Reinforcement Learning Based Integrated Energy System Management: A Survey

Cited by: 0
Authors
Xiong L.-L. [1 ]
Mao S. [1 ]
Tang Y. [1 ]
Meng K. [2 ]
Dong Z.-Y. [2 ]
Qian F. [1 ]
Affiliations
[1] School of Information Science and Engineering, East China University of Science and Technology, Shanghai, China
[2] School of Electrical Engineering and Telecommunications, University of New South Wales, Sydney, NSW 2052, Australia
Source
Zidonghua Xuebao/Acta Automatica Sinica | 2021, Vol. 47, No. 10
Funding
National Natural Science Foundation of China
Keywords
Energy management; Integrated energy system (IES); Power system; Reinforcement learning (RL)
DOI
10.16383/j.aas.c210166
Abstract
In order to meet the growing energy demand while reducing damage to the environment, energy conservation has become a long-term strategic policy for global economic and social development. Better energy management can improve energy efficiency and promote energy conservation and emission reduction. However, the integration of renewable energy and flexible loads turns the integrated energy system (IES) into a complex dynamic system with high uncertainty, which poses great challenges to modern energy management. Reinforcement learning (RL), as a typical interactive trial-and-error learning method, is well suited to optimization problems in complex dynamic systems with uncertainty, and has therefore received wide attention in integrated energy system management. This paper systematically reviews existing work on applying reinforcement learning to integrated energy system management from the perspectives of models and algorithms, and discusses future prospects from four aspects: multi-time scale, interpretability, transferability, and information security. Copyright © 2021 Acta Automatica Sinica. All rights reserved.
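The survey itself contains no code; the following is a minimal, purely illustrative sketch of the interactive trial-and-error setting the abstract refers to, casting a toy energy-management decision (battery dispatch against a time-varying electricity price) as a Markov decision process solved with tabular Q-learning. All names and numbers here (battery levels, price profile, demand, learning rates) are hypothetical assumptions made for the example, not quantities taken from the paper.

```python
# Illustrative sketch (not from the survey): tabular Q-learning for a toy
# battery-dispatch problem. The state is (hour of price cycle, battery level),
# the action is {discharge, idle, charge}, and the reward is the negative cost
# of grid energy purchased under an assumed cyclic price profile.
import numpy as np

rng = np.random.default_rng(0)

N_LEVELS = 11                              # discretized state-of-charge levels 0..10 (assumed)
ACTIONS = (-1, 0, 1)                       # discharge one level, idle, charge one level
PRICES = [0.2, 0.2, 0.5, 0.9, 0.9, 0.5]    # assumed cyclic electricity price profile
DEMAND = 1.0                               # fixed load per step (assumed)

q_table = np.zeros((len(PRICES), N_LEVELS, len(ACTIONS)))
alpha, gamma, epsilon = 0.1, 0.95, 0.1     # learning rate, discount, exploration rate

def step(t, soc, a_idx):
    """Apply an action and return (next_soc, reward)."""
    next_soc = int(np.clip(soc + ACTIONS[a_idx], 0, N_LEVELS - 1))
    # Grid purchase = load plus charging energy minus discharged energy.
    grid_energy = DEMAND + (next_soc - soc)
    reward = -PRICES[t] * max(grid_energy, 0.0)
    return next_soc, reward

soc = N_LEVELS // 2
for episode in range(2000):
    for t in range(len(PRICES)):
        s = (t, soc)
        # Epsilon-greedy action selection.
        if rng.random() < epsilon:
            a = int(rng.integers(len(ACTIONS)))
        else:
            a = int(np.argmax(q_table[s]))
        next_soc, r = step(t, soc, a)
        s_next = ((t + 1) % len(PRICES), next_soc)
        # Standard Q-learning temporal-difference update.
        td_target = r + gamma * np.max(q_table[s_next])
        q_table[s + (a,)] += alpha * (td_target - q_table[s + (a,)])
        soc = next_soc

# Greedy policy at mid state-of-charge for each hour of the price cycle.
print([int(np.argmax(q_table[t, N_LEVELS // 2])) for t in range(len(PRICES))])
```

Real IES management problems surveyed in the paper involve continuous states, multiple energy carriers, and deep RL methods, but the underlying loop (observe state, act, receive reward, update a value estimate) is the same as in this sketch.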
Pages: 2321-2340
Number of pages: 19