Residential demand response online optimization based on multi-agent deep reinforcement learning

Times Cited: 1
Author
Yuan, Quan [1 ]
Affiliation
[1] Wuxi Univ, Sch Automat, Wuxi, Peoples R China
Keywords
Demand response; Multi-agent deep reinforcement learning; Electric vehicle; Online energy management
DOI
10.1016/j.epsr.2024.110987
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline Classification Code
0808; 0809
Abstract
Demand-side management in the context of smart grids exploits the flexibility value of loads, promoting stable and economical operation of the power grid. To address multiple uncertainties such as device characteristics, distributed renewable generation, and residential electricity demand, this paper proposes a model-free framework based on multi-agent deep reinforcement learning to achieve efficient online optimization of demand response strategies. First, demand-side management is formulated as a partially observable Markov game. Then, a multi-agent soft actor-critic algorithm is proposed, combined with a temporal electricity-price feature extraction technique to improve learning efficiency. A reward correction mechanism is designed to prevent new charging and discharging peaks during demand response. Finally, case studies demonstrate the effectiveness of the proposed method in reducing residents' electricity costs and aggregate peak demand. The proposed method reduces the daily average cost by 21.40%, 8.09%, and 5.88% compared to MAPPO, MADDPG, and SUMADRL, respectively, and exhibits the highest training efficiency and scalability among the benchmarks.
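The reward correction mechanism described in the abstract, which discourages flexible devices from creating new charging and discharging peaks, can be illustrated with a minimal sketch. The Python code below is a hypothetical example and not the paper's implementation: the function name corrected_reward, the peak_threshold parameter, and the linear penalty form are assumptions introduced purely for illustration. It combines an agent's electricity cost with a shared penalty that activates whenever aggregate demand exceeds a peak threshold.

# Hypothetical sketch of a per-agent reward with a peak-penalty correction,
# assuming time-of-use prices and controllable EV charging power.
# The exact reward shaping used in the paper is not reproduced here.

def corrected_reward(
    price: float,               # electricity price at the current step ($/kWh)
    agent_power: float,         # this agent's net charging power (kW), negative = discharging
    aggregate_power: float,     # total community demand after all agents act (kW)
    peak_threshold: float,      # demand level above which a new peak is penalized (kW)
    dt_hours: float = 1.0,      # length of one control interval in hours
    penalty_coeff: float = 0.5  # weight of the peak-avoidance correction term
) -> float:
    """Electricity-cost reward plus a correction that discourages new aggregate peaks."""
    cost = price * agent_power * dt_hours           # energy cost incurred by this agent
    peak_excess = max(0.0, aggregate_power - peak_threshold)
    correction = penalty_coeff * peak_excess        # shared penalty for exceeding the threshold
    return -(cost + correction)                     # reward = negative cost, reduced further near peaks

if __name__ == "__main__":
    # Example: a cheap off-peak price tempts every agent to charge at once,
    # but the correction term dampens the resulting aggregate spike.
    r = corrected_reward(price=0.08, agent_power=7.0, aggregate_power=55.0,
                         peak_threshold=40.0, dt_hours=0.25)
    print(f"corrected reward: {r:.3f}")

Under this kind of shaping, agents that all chase low off-peak prices see their rewards reduced once the aggregate load crosses the threshold, which is the qualitative effect the paper's reward correction mechanism aims for.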
Pages: 6