Deep Reinforcement Learning-Based Real-Time Energy Management for an Integrated Electric-Thermal Energy System

Cited: 4
Authors
Shuai, Qiang [1 ]
Yin, Yue [1 ]
Huang, Shan [1 ]
Chen, Chao [1 ]
Affiliations
[1] Sichuan Univ, Coll Elect Engn, Chengdu 610065, Peoples R China
Keywords
integrated electric-thermal energy system; energy management; renewable energy; deep reinforcement learning; improved proximal policy optimization algorithm;
DOI
10.3390/su17020407
Chinese Library Classification
X [Environmental Science, Safety Science];
Discipline Classification Code
08; 0830;
Abstract
Renewable energy plays a crucial role in achieving sustainable development and has the potential to meet humanity's long-term energy requirements. Integrated electric-thermal energy systems are an important means of absorbing a high proportion of renewable energy. However, the intermittency and volatility of renewable generation within such systems make the energy management optimization problem difficult to solve. This paper therefore proposes an energy management optimization method for an integrated electric-thermal energy system based on an improved proximal policy optimization algorithm, which mitigates the low accuracy and low solving efficiency of traditional heuristic algorithms and mathematical programming methods. The proposed algorithm also improves both convergence speed and overall performance compared with the standard proximal policy optimization algorithm. The paper first establishes a mathematical model for the energy management of an integrated electric-thermal energy system. The model is then formulated as a Markov decision process, and a reward mechanism is designed to guide the agent in learning, from historical data, the uncertainty characteristics of renewable energy output and load consumption in the system. Finally, in the case study, the proposed algorithm reduces the average running cost by 2.32% compared with the other algorithms discussed in the paper, demonstrating its effectiveness and cost-efficiency.
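The record does not detail the paper's specific improvements to proximal policy optimization, but the method builds on the standard PPO clipped surrogate objective, which can be sketched as follows (a minimal illustration with NumPy; the function name and toy values are hypothetical, not taken from the paper):

```python
import numpy as np

def ppo_clip_loss(ratio, advantage, eps=0.2):
    """Standard PPO clipped surrogate objective (to be maximized).

    ratio     : new-policy / old-policy probability of the chosen action
    advantage : estimated advantage of that action
    eps       : clipping range, limiting how far the policy update can move
    """
    unclipped = ratio * advantage
    clipped = np.clip(ratio, 1.0 - eps, 1.0 + eps) * advantage
    # Taking the element-wise minimum penalizes overly large policy updates,
    # which is what stabilizes PPO on volatile signals such as renewable output.
    return np.minimum(unclipped, clipped)

# Toy example: a dispatch action that became 50% more likely under the new
# policy, with advantage 2.0 -- the gain is capped at (1 + eps) * advantage.
print(ppo_clip_loss(np.array([1.5]), np.array([2.0])))  # → [2.4]
```

In the energy-management setting described in the abstract, the agent's action would be the dispatch decision at each time step and the reward would reflect the (negative) running cost of the integrated electric-thermal system.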
Pages: 17