Energy Management in Microgrids Using Model-Free Deep Reinforcement Learning Approach

Cited by: 1
Authors
Talab, Odia A. [1 ]
Avci, Isa [1 ]
Affiliation
[1] Fac Engn, Dept Comp Engn, TR-78050 Karabuk, Turkiye
Source
IEEE ACCESS | 2025, Vol. 13
Keywords
Energy management; Uncertainty; Renewable energy sources; Load modeling; Costs; Microgrids; Batteries; Wind turbines; Power system stability; Power system dynamics; DDPG; RESs; energy management; FCSs; microgrid; EVs; SYSTEMS; OPERATION; WEATHER;
DOI
10.1109/ACCESS.2025.3525843
CLC classification number
TP [Automation technology, computer technology];
Subject classification code
0812;
Abstract
Electric power systems are undergoing rapid modernization driven by advancements in smart-grid technologies, and microgrids (MGs) play a crucial role in integrating renewable energy sources (RESs), such as wind and solar energy, into existing grids. MGs offer a flexible and efficient framework for accommodating dispersed energy resources. However, the intermittent nature of renewable sources, coupled with the rising demand for electric vehicles (EVs) and fast charging stations (FCSs), poses significant challenges to the stability and efficiency of MG operations. These challenges stem from uncertainties in both energy generation and fluctuating demand patterns, making efficient energy management in MGs a complex task. This study introduces a novel model-free strategy for real-time energy management in MGs that addresses these uncertainties without relying on traditional uncertainty-modeling techniques. Unlike conventional methods, the proposed approach enhances MG performance by minimizing power losses and operational costs. The problem is formulated as a Markov Decision Process (MDP) with well-defined objectives. To optimize decision-making, an actor-critic-based Deep Deterministic Policy Gradient (DDPG) algorithm is developed, leveraging reinforcement learning (RL) to adapt dynamically to changing system conditions. Comprehensive numerical simulations demonstrate the effectiveness of the proposed strategy. The results show a total cost of 51.8770 €ct/kWh, a reduction of 3.19% compared to the Dueling Deep Q-Network (Dueling DQN) and 4% compared to the Deep Q-Network (DQN). This highlights the robustness and scalability of the proposed model-free approach for modern MG energy management.
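To make the algorithmic idea in the abstract concrete, the sketch below shows how an actor-critic DDPG update of the kind described could be set up in PyTorch. The state/action dimensions, network sizes, and reward signal are hypothetical placeholders for a microgrid dispatch problem (e.g., battery and FCS power setpoints), not the authors' implementation.

# Minimal sketch of an actor-critic DDPG update for MG energy management.
# Dimensions and reward are illustrative assumptions, not from the paper.
import torch
import torch.nn as nn

STATE_DIM, ACTION_DIM = 6, 2   # e.g., [PV, wind, load, price, SoC, hour]; [battery power, FCS power]
GAMMA, TAU = 0.99, 0.005       # discount factor and target-network soft-update rate

class Actor(nn.Module):
    """Deterministic policy: maps the MG state to a continuous dispatch action in [-1, 1]."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, ACTION_DIM), nn.Tanh())
    def forward(self, s):
        return self.net(s)

class Critic(nn.Module):
    """Q-network: scores a (state, action) pair, e.g., negative operating cost plus loss penalty."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM + ACTION_DIM, 64), nn.ReLU(),
            nn.Linear(64, 1))
    def forward(self, s, a):
        return self.net(torch.cat([s, a], dim=-1))

actor, critic = Actor(), Critic()
actor_t, critic_t = Actor(), Critic()          # target networks
actor_t.load_state_dict(actor.state_dict())
critic_t.load_state_dict(critic.state_dict())
opt_a = torch.optim.Adam(actor.parameters(), lr=1e-4)
opt_c = torch.optim.Adam(critic.parameters(), lr=1e-3)

def ddpg_update(s, a, r, s_next, done):
    """One gradient step on a mini-batch sampled from a replay buffer."""
    with torch.no_grad():
        q_target = r + GAMMA * (1 - done) * critic_t(s_next, actor_t(s_next))
    critic_loss = nn.functional.mse_loss(critic(s, a), q_target)
    opt_c.zero_grad(); critic_loss.backward(); opt_c.step()

    actor_loss = -critic(s, actor(s)).mean()   # maximize Q under the current policy
    opt_a.zero_grad(); actor_loss.backward(); opt_a.step()

    # Polyak-averaged soft update of the target networks
    for net, tgt in ((actor, actor_t), (critic, critic_t)):
        for p, p_t in zip(net.parameters(), tgt.parameters()):
            p_t.data.mul_(1 - TAU).add_(TAU * p.data)

# Smoke test with random transitions (batch of 32)
if __name__ == "__main__":
    s = torch.randn(32, STATE_DIM); a = torch.randn(32, ACTION_DIM).clamp(-1, 1)
    r = torch.randn(32, 1); s2 = torch.randn(32, STATE_DIM); d = torch.zeros(32, 1)
    ddpg_update(s, a, r, s2, d)

In an actual MG setting, the transitions would come from an environment that simulates RES output, load, EV/FCS demand, and electricity prices, with the reward encoding the negative of operational cost and power losses.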
Pages: 5871-5891
Number of pages: 21
Related references
52 in total
[11]   Real-time optimal energy management of microgrid with uncertainties based on deep reinforcement learning [J].
Guo, Chenyu ;
Wang, Xin ;
Zheng, Yihui ;
Zhang, Feng .
ENERGY, 2022, 238
[12]   Optimal energy management strategies for energy Internet via deep reinforcement learning approach [J].
Hua, Haochen ;
Qin, Yuchao ;
Hao, Chuantong ;
Cao, Junwei .
APPLIED ENERGY, 2019, 239 :598-609
[13]   Data-Driven Fast Uncertainty Assessment of Distribution Systems With Correlated EV Charging Demand and Renewable Generation [J].
Jiang, Yazhou ;
Ortmeyer, Thomas ;
Fan, Miao .
IEEE TRANSACTIONS ON SUSTAINABLE ENERGY, 2023, 14 (03) :1446-1456
[14]   Deep learning in power systems research: A review [J].
Khodayar, Mahdi ;
Liu, Guangyi ;
Wang, Jianhui ;
Khodayar, Mohammad E. .
CSEE JOURNAL OF POWER AND ENERGY SYSTEMS, 2021, 7 (02) :209-220
[15]   Reinforcement Learning Based Energy Management Algorithm for Smart Energy Buildings [J].
Kim, Sunyong ;
Lim, Hyuk .
ENERGIES, 2018, 11 (08)
[16]   Forecasting Charging Demand of Electric Vehicles Using Time-Series Models [J].
Kim, Yunsun ;
Kim, Sahm .
ENERGIES, 2021, 14 (05)
[17]   Multi-Objective Optimization of PV and Energy Storage Systems for Ultra-Fast Charging Stations [J].
Leone, Carola ;
Longo, Michela ;
Fernandez-Ramirez, Luis M. ;
Garcia-Trivino, Pablo .
IEEE ACCESS, 2022, 10 :14208-14224
[18]   Energy Management for Microgrids: a Reinforcement Learning Approach [J].
Levent, Tanguy ;
Preux, Philippe ;
Le Pennec, Erwan ;
Badosa, Jordi ;
Henri, Gonzague ;
Bonnassieux, Yvan .
PROCEEDINGS OF 2019 IEEE PES INNOVATIVE SMART GRID TECHNOLOGIES EUROPE (ISGT-EUROPE), 2019,
[19]   Combined Two-Stage Stochastic Programming and Receding Horizon Control Strategy for Microgrid Energy Management Considering Uncertainty [J].
Li, Zhongwen ;
Zang, Chuanzhi ;
Zeng, Peng ;
Yu, Haibin .
ENERGIES, 2016, 9 (07)
[20]   Sequence Generative Adversarial Networks for Wind Power Scenario Generation [J].
Liang, Junkai ;
Tang, Wenyuan .
IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2020, 38 (01) :110-118