Optimal economic dispatch of a virtual power plant based on gated recurrent unit proximal policy optimization

Cited: 0
Authors
Gao, Zhiping [1 ]
Kang, Wenwen [1 ]
Chen, Xinghua [1 ]
Gong, Siru [1 ]
Liu, Zongxiong [1 ]
He, Degang [2 ]
Shi, Shen [3 ]
Shangguan, Xing-Chen [3 ]
Affiliations
[1] State Power Investment Grp Co Ltd, Hubei Branch, Wuhan, Peoples R China
[2] Inst New Energy, Wuhan, Peoples R China
[3] China Univ Geosci, Sch Automat, Wuhan 430074, Peoples R China
Keywords
virtual power plant; demand response; deep reinforcement learning; gated recurrent unit; proximal policy optimization
KeyWords Plus
DEMAND RESPONSE; ENERGY; MULTIENERGY; INTEGRATION; MANAGEMENT; INTERNET; STRATEGY; MARKETS; WIND
DOI
10.3389/fenrg.2024.1357406
Chinese Library Classification (CLC)
TE [Petroleum and Natural Gas Industry]; TK [Energy and Power Engineering]
Subject Classification Codes
0807; 0820
Abstract
The intermittent renewable energy in a virtual power plant (VPP) introduces generation uncertainties, which prevent the VPP from providing a reliable and user-friendly power supply. To address this issue, this paper proposes a gated recurrent unit proximal policy optimization (GRUPPO)-based optimal VPP economic dispatch method. First, models of electrical generation, storage, and consumption are established to form a VPP framework that accounts for the accessibility of VPP state information. The optimal VPP economic dispatch is then formulated as a partially observable Markov decision process (POMDP) problem. A novel deep reinforcement learning method, GRUPPO, is further developed to exploit the time-series characteristics of the VPP. Finally, case studies are conducted over a 24-h period based on actual historical data. The results show that the proposed economic dispatch reduces the operating cost by up to 6.5% and effectively mitigates supply-demand uncertainty.
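To make the abstract's method concrete, the following is a minimal sketch (not the authors' implementation) of the kind of GRU-based actor-critic and clipped PPO loss a GRUPPO agent could use for the POMDP dispatch problem; the observation/action dimensions, network sizes, and variable names are illustrative assumptions rather than values from the paper.

import torch
import torch.nn as nn

class GRUActorCritic(nn.Module):
    """GRU encoder over partial observations, with Gaussian actor and value critic (illustrative)."""
    def __init__(self, obs_dim: int, act_dim: int, hidden: int = 64):
        super().__init__()
        # GRU summarizes the history of partially observed VPP states (time-series inputs).
        self.gru = nn.GRU(obs_dim, hidden, batch_first=True)
        # Actor head: mean of a Gaussian dispatch action (e.g., storage/generation set-points).
        self.actor_mean = nn.Linear(hidden, act_dim)
        self.log_std = nn.Parameter(torch.zeros(act_dim))
        # Critic head: state-value estimate used to compute PPO advantages.
        self.critic = nn.Linear(hidden, 1)

    def forward(self, obs_seq, h=None):
        out, h = self.gru(obs_seq, h)      # out: (batch, T, hidden)
        feat = out[:, -1]                  # last hidden state summarizes the observation history
        dist = torch.distributions.Normal(self.actor_mean(feat), self.log_std.exp())
        return dist, self.critic(feat).squeeze(-1), h

def ppo_clip_loss(dist, actions, old_logp, advantages, clip_eps: float = 0.2):
    """Standard PPO clipped surrogate objective (returned as a loss to minimize)."""
    logp = dist.log_prob(actions).sum(-1)
    ratio = torch.exp(logp - old_logp)
    unclipped = ratio * advantages
    clipped = torch.clamp(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantages
    return -torch.min(unclipped, clipped).mean()

The design choice reflected here is the one the abstract motivates: because the dispatch problem is a POMDP, a recurrent (GRU) encoder over the observation history replaces the feed-forward state encoder of vanilla PPO, while the clipped surrogate objective itself is unchanged.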
Pages: 14