Cost-Effective Power Delivery via Deep Reinforcement Learning-Based Dynamic Electric Vehicle Transportation

Cited by: 9
Authors
Bao, Zheng [1 ]
Tang, Changbing [2 ]
Yu, Xinghuo [3 ]
Lin, Feilong [1 ]
Wen, Guanghui [4 ]
Zheng, Zhonglong [1 ]
Affiliations
[1] Zhejiang Normal Univ, Sch Comp Sci & Technol, Jinhua 321004, Peoples R China
[2] Zhejiang Normal Univ, Coll Phys & Elect Informat Engn, Jinhua 321004, Peoples R China
[3] RMIT Univ, Sch Engn, Melbourne, Vic 3000, Australia
[4] Southeast Univ, Sch Automat, Nanjing 211189, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Power transmission lines; Batteries; Costs; Transportation; Power system reliability; Load modeling; Load flow; Reliability; Power system dynamics; Load shedding; Electric vehicle; load shedding; Markov decision process (MDP); power delivery; reinforcement learning (RL); MANAGEMENT;
DOI
10.1109/JIOT.2025.3552823
CLC Classification Code
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Power delivery issues are increasingly evident in cyber-physical smart grid systems, as energy transactions frequently overlook the physical constraints of distribution, leading to transmission congestion and compromising network security and reliability. This article presents a novel and cost-effective solution to power delivery challenges by utilizing electric vehicles (EVs) with dynamic transportation capabilities as free carriers. Unlike traditional approaches, a deep reinforcement learning (DRL)-based optimization framework is designed to effectively manage incomplete information in real time. Our method first introduces an investment-free model that leverages existing EV routes to transport energy during congestion, operating in a "free-riding" transmission mode. This not only enhances network reliability but also curtails costs. We then develop a Markov decision process (MDP) for sequential decision-making over a 24-h optimal control horizon, aimed at minimizing operational losses including load shedding and battery degradation. To handle the stochastic nature of energy requests and EV routes in the control problem, we employ a model-free DRL algorithm to tackle the challenge of incomplete information. An Actor-Critic network, combining value-based and policy-based approaches, discovers approximately optimal strategies in a continuous action space. Finally, simulation results demonstrate the performance of the proposed method.
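The abstract describes a 24-h MDP whose per-step cost combines load shedding and battery degradation, solved by a continuous-action DRL agent. A minimal sketch of how such an MDP's state and transition might be structured is shown below; all names, cost coefficients, and dynamics here are illustrative assumptions for exposition, not the paper's actual model:

```python
from dataclasses import dataclass

# Assumed per-unit costs (not from the paper): shedding is penalized far
# more heavily than the battery wear incurred by delivering energy.
C_SHED = 10.0   # cost per MWh of load shed
C_DEG = 0.05    # battery degradation cost per MWh delivered by the EV


@dataclass
class State:
    hour: int      # 0..23, position in the 24-h control horizon
    soc: float     # EV battery state of charge, in [0, 1]
    demand: float  # unmet demand (MWh) at the congested node


def step(state: State, action: float, capacity: float = 50.0):
    """One-hour transition: the agent chooses how much energy (MWh) the
    passing EV delivers. Delivery is capped by the action, the stored
    energy, and the remaining demand; whatever demand is left is shed.
    Returns (next_state, reward), reward being the negative total cost."""
    delivered = min(action, state.soc * capacity, state.demand)
    shed = state.demand - delivered
    reward = -(C_SHED * shed + C_DEG * delivered)
    next_state = State(
        hour=state.hour + 1,
        soc=state.soc - delivered / capacity,
        demand=max(0.0, state.demand - delivered),
    )
    return next_state, reward
```

In a full implementation, an Actor-Critic agent would map `State` to a continuous delivery action over the 24-step episode, with stochastic demand and EV arrivals replacing the deterministic transition sketched here.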
Pages: 23245-23256
Page count: 12