Residential demand response online optimization based on multi-agent deep reinforcement learning

Cited by: 1
Authors
Yuan, Quan [1 ]
Affiliations
[1] Wuxi Univ, Sch Automat, Wuxi, Peoples R China
Keywords
Demand response; Multi-agent deep reinforcement learning; Electric vehicle; Online energy management
DOI
10.1016/j.epsr.2024.110987
Chinese Library Classification
TM [Electrical technology]; TN [Electronic technology, communication technology]
Discipline Codes
0808; 0809
Abstract
Demand-side management in smart grids exploits the value of load flexibility, promoting stable and economical operation of the power grid. To address multiple sources of uncertainty, such as device characteristics, distributed renewable generation, and residential electricity demand, this paper proposes a model-free framework based on multi-agent deep reinforcement learning for efficient online optimization of demand response strategies. First, demand-side management is formulated as a partially observable Markov game. Then, a multi-agent soft actor-critic algorithm is proposed, combined with a temporal electricity-price feature-extraction technique to improve learning efficiency. A reward-correction mechanism is designed to prevent demand response from creating new charging and discharging peaks. Finally, case studies demonstrate the effectiveness of the proposed method in reducing residents' electricity costs and aggregate peak demand. The proposed method reduces the daily average cost by 21.40%, 8.09%, and 5.88% compared with MAPPO, MADDPG, and SUMADRL, respectively, and exhibits the highest training efficiency and scalability among the benchmarks.
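The reward-correction mechanism described in the abstract can be sketched as a shared peak penalty added to each household agent's cost-based reward, so that agents chasing low prices do not synchronize their EV charging into a new aggregate peak. The following is a minimal illustration only, not the paper's actual formulation; the function name, the linear penalty form, and the coefficient `penalty` are all assumptions.

```python
def corrected_reward(cost, aggregate_load, peak_threshold, penalty=0.5):
    """Hypothetical reward correction for one agent at one time step.

    cost:            the household's electricity cost this step (to be minimized)
    aggregate_load:  total load of all households this step (kW)
    peak_threshold:  load level above which a new peak is considered created
    penalty:         weight of the shared peak penalty (assumed linear form)
    """
    # Penalize only the portion of aggregate load that exceeds the threshold,
    # discouraging all agents from shifting charging into the same cheap hour.
    peak_excess = max(0.0, aggregate_load - peak_threshold)
    return -cost - penalty * peak_excess
```

With this shaping, an agent that shifts charging to a cheap hour still receives a lower reward if every other agent does the same and the aggregate load exceeds the threshold.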
Pages: 6
Cited References
23 records in total
[1] Ajagekar, Akshay; Decardi-Nelson, Benjamin; You, Fengqi. Energy management for demand response in networked greenhouses with multi-agent deep reinforcement learning. Applied Energy, 2024, 355.
[2] Almani, Aftab Ahmed; Han, Xueshan. Real-Time Pricing-Enabled Demand Response Using Long Short-Time Memory Deep Learning. Energies, 2023, 16 (05).
[3] Arias, Nataly Banol; Hashemi, Seyedmostafa; Andersen, Peter Bach; Traeholt, Chresten; Romero, Ruben. Distribution System Services Provided by Electric Vehicles: Recent Status, Challenges, and Future Prospects. IEEE Transactions on Intelligent Transportation Systems, 2019, 20 (12): 4277-4296.
[4] Bahrami, Shahab; Chen, Yu Christine; Wong, Vincent W. S. Deep Reinforcement Learning for Demand Response in Distribution Networks. IEEE Transactions on Smart Grid, 2021, 12 (02): 1496-1506.
[5] Bakare, M. S.; Abdulkarim, A.; Zeeshan, M.; Shuaibu, A. N. A comprehensive overview on demand side energy management towards smart grids: challenges, solutions, and future direction. Energy Informatics, 2023, 6 (01).
[6] Coffman, Austin R.; Guo, Zhong; Barooah, Prabir. Characterizing Capacity of Flexible Loads for Providing Grid Support. IEEE Transactions on Power Systems, 2021, 36 (03): 2428-2437.
[7] Dahiru, Ahmed Tijjani; Daud, Dzurllkanian; Tan, Chee Wei; Jagun, Zainab Toyin; Samsudin, Salfarina; Dobi, Abdulhakeem Mohammed. A comprehensive review of demand side management in distributed grids based on real estate perspectives. Environmental Science and Pollution Research, 2023, 30 (34): 81984-82013.
[8] Freire, Vlademir A.; Ramos De Arruda, Lucia Valeria; Bordons, Carlos; Jose Marquez, Juan. Optimal Demand Response Management of a Residential Microgrid Using Model Predictive Control. IEEE Access, 2020, 8: 228264-228276.
[9] Han, Yinghua; Wu, Jingrun; Chen, Haoqi; Si, Fangyuan; Cao, Zhiao; Zhao, Qiang. Enhancing Grid-Interactive Buildings Demand Response: Sequential Update-Based Multiagent Deep Reinforcement Learning Approach. IEEE Internet of Things Journal, 2024, 11 (14): 24439-24451.
[10] Kong, Xiangyu; Lu, Wenqi; Wu, Jianzhong; Wang, Chengshan; Zhao, Xv; Hu, Wei; Shen, Yu. Real-time pricing method for VPP demand response based on PER-DDPG algorithm. Energy, 2023, 271.