Multi-agent deep deterministic policy gradient algorithm for peer-to-peer energy trading considering distribution network constraints

Cited by: 47
Authors
Samende, Cephas [1]
Cao, Jun [2]
Fan, Zhong [1]
Affiliations
[1] Keele Univ, Sch Comp & Math, Keele, England
[2] LIST, Environm Res & Innovat Dept, Esch Sur Alzette, Luxembourg
Funding
Innovate UK project
Keywords
Multi-agent; Deep deterministic policy gradient; Peer-to-peer energy trading; Renewable generation; Markov decision process; POWER; MODEL;
DOI
10.1016/j.apenergy.2022.119123
Chinese Library Classification (CLC)
TE [Petroleum and natural gas industry]; TK [Energy and power engineering]
Subject classification codes
0807; 0820
Abstract
In this paper, we investigate an energy cost minimization problem for prosumers participating in peer-to-peer energy trading. Due to (i) uncertainties caused by renewable energy generation and consumption, (ii) difficulties in developing an accurate and efficient energy trading model, and (iii) the need to satisfy distribution network constraints, it is challenging for prosumers to obtain optimal energy trading decisions that minimize their individual energy costs. To address this challenge, we first formulate the above problem as a Markov decision process and propose a multi-agent deep deterministic policy gradient algorithm to learn optimal energy trading decisions. To satisfy the distribution network constraints, we propose distribution network tariffs, which we incorporate into the algorithm to reward energy trading decisions that help satisfy the constraints and to penalize decisions that violate them. The proposed algorithm is model-free and allows the agents to learn optimal energy trading decisions without prior information about other agents in the network. Simulation results based on real-world datasets show the effectiveness and robustness of the proposed algorithm.
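To illustrate the reward-shaping idea described in the abstract, the following is a minimal sketch of how a distribution network tariff could enter each prosumer agent's per-step reward. It is an illustrative assumption only: the function names, the voltage-band tariff form, and all numerical values (band limits, penalty rate, prices) are hypothetical and not the authors' exact formulation.

```python
import numpy as np

def network_tariff(voltage_pu, v_min=0.95, v_max=1.05, penalty_rate=10.0):
    """Hypothetical tariff: charge in proportion to how far the agent's
    local bus voltage strays outside an allowed band; zero inside it."""
    violation = max(0.0, v_min - voltage_pu) + max(0.0, voltage_pu - v_max)
    return penalty_rate * violation

def agent_reward(p2p_price, grid_price, traded_kwh, grid_kwh, voltage_pu):
    """Negative of one agent's total cost for a single time step:
    peer-to-peer trade cost + residual grid cost + network tariff."""
    energy_cost = p2p_price * traded_kwh + grid_price * grid_kwh
    return -(energy_cost + network_tariff(voltage_pu))

# Example: an agent buys 2 kWh peer-to-peer and 1 kWh from the grid while
# its bus voltage sits just below the lower limit, so a small tariff applies.
r = agent_reward(p2p_price=0.08, grid_price=0.15,
                 traded_kwh=2.0, grid_kwh=1.0, voltage_pu=0.94)
print(f"per-step reward = {r:.3f}")
```

In a multi-agent deep deterministic policy gradient setup, a reward of this shape would be computed per agent and per time step, so that trading decisions which keep the network within limits earn higher rewards than those which violate them.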
Pages: 10