Intelligent Electric Vehicle Charging Scheduling in Transportation-Energy Nexus with Distributional Reinforcement Learning

Cited: 3
Authors
Chen, Tao [1 ,2 ]
Gao, Ciwei [1 ,2 ]
Affiliations
[1] Southeast Univ, Sch Elect Engn, Nanjing 210096, Peoples R China
[2] Southeast Univ, Jiangsu Prov Key Lab Smart Grid Technol & Equipme, Nanjing 210096, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
10.1109/JAS.2023.123285
Chinese Library Classification
TP [Automation Technology; Computer Technology];
Discipline Code
0812 ;
Abstract
Dear Editor, This letter is concerned with the electric vehicle (EV) charging scheduling problem in the transportation-energy nexus, using an intelligent decision-making strategy with probabilistic self-adaptability features. To accommodate the coupling effects of stochastic EV driving behavior on the transport network and the distribution network, a risk-captured distributional reinforcement learning solution is presented that uses explicit probabilistic information for the action and reward function in a Markov decision process (MDP) model, where the Bellman equation is extended to a more generalized, distributional version. Scheduling EV charging in a transportation-energy nexus according to both transport and distribution network conditions is an important topic in recent studies aimed at improving driving and charging energy efficiency, especially given the high penetration rate of EVs today and the even higher rates expected in the future [1]. To accommodate the coupling effects of stochastic EV driving behavior and battery state-of-charge (SoC) on the transport and distribution networks, various methods have been developed for designing smart charging scheduling strategies that account for electricity prices, renewable energy adoption, road conditions, and many other factors.
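The abstract states that the Bellman equation is extended to a distributional version, in the spirit of the distributional perspective of [1]. As an illustration only, and not the authors' exact formulation, a minimal categorical (C51-style) projection of the distributional Bellman target onto a fixed return support might look like the sketch below; all function and variable names here are hypothetical.

```python
import numpy as np

def categorical_bellman_update(p_next, r, gamma, atoms):
    """Project the distributional Bellman target T Z = r + gamma * Z'
    onto a fixed support `atoms` (categorical/C51-style projection).

    p_next : probabilities of the next-state return distribution over `atoms`
    r      : immediate reward (e.g., negative charging cost)
    gamma  : discount factor
    atoms  : fixed support z_1 < ... < z_N of the return distribution
    """
    v_min, v_max = atoms[0], atoms[-1]
    dz = atoms[1] - atoms[0]
    # Shift and scale the support by the Bellman operator, clip to the grid range
    tz = np.clip(r + gamma * atoms, v_min, v_max)
    # Fractional grid index of each shifted atom on the original support
    b = (tz - v_min) / dz
    lower = np.floor(b).astype(int)
    upper = np.ceil(b).astype(int)
    m = np.zeros_like(p_next)
    for j in range(len(atoms)):
        if lower[j] == upper[j]:
            # Shifted atom lands exactly on a grid point
            m[lower[j]] += p_next[j]
        else:
            # Split the probability mass between the two neighboring grid points
            m[lower[j]] += p_next[j] * (upper[j] - b[j])
            m[upper[j]] += p_next[j] * (b[j] - lower[j])
    return m
```

Because the full return distribution is carried through the update rather than just its mean, risk measures (e.g., quantiles of the charging-cost distribution) can be read off directly, which is what makes such a formulation "risk-captured."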
Pages: 2171-2173
Page count: 3
References
8 items
[1]   A Distributional Perspective on Reinforcement Learning [C].
Bellemare, Marc G. ;
Dabney, Will ;
Munos, Remi .
PROCEEDINGS OF MACHINE LEARNING RESEARCH, 2017, 70
[2]   Optimal Routing and Charging of an Electric Vehicle Fleet for High-Efficiency Dynamic Transit Systems [J].
Chen, Tao ;
Zhang, Bowen ;
Pourbabak, Hajir ;
Kavousi-Fard, Abdollah ;
Su, Wencong .
IEEE TRANSACTIONS ON SMART GRID, 2018, 9 (04) :3563-3572
[3]   Deep Reinforcement Learning for Intelligent Transportation Systems: A Survey [J].
Haydari, Ammar ;
Yilmaz, Yasin .
IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (01) :11-32
[4]   Constrained EV Charging Scheduling Based on Safe Deep Reinforcement Learning [J].
Li, Hepeng ;
Wan, Zhiqiang ;
He, Haibo .
IEEE TRANSACTIONS ON SMART GRID, 2020, 11 (03) :2427-2439
[5]   Mobility-Aware Charging Scheduling for Shared On-Demand Electric Vehicle Fleet Using Deep Reinforcement Learning [J].
Liang, Yanchang ;
Ding, Zhaohao ;
Ding, Tao ;
Lee, Wei-Jen .
IEEE TRANSACTIONS ON SMART GRID, 2021, 12 (02) :1380-1393
[6]   Network Equilibrium of Coupled Transportation and Power Distribution Systems [J].
Wei, Wei ;
Wu, Lei ;
Wang, Jianhui ;
Mei, Shengwei .
IEEE TRANSACTIONS ON SMART GRID, 2018, 9 (06) :6764-6779
[7]   Power and Transport Nexus: Routing Electric Vehicles to Promote Renewable Power Integration [J].
Zhang, Hongcai ;
Hu, Zechun ;
Song, Yonghua .
IEEE TRANSACTIONS ON SMART GRID, 2020, 11 (04) :3291-3301
[8]   A Second-Order Cone Programming Model for Planning PEV Fast-Charging Stations [J].
Zhang, Hongcai ;
Moura, Scott J. ;
Hu, Zechun ;
Qi, Wei ;
Song, Yonghua .
IEEE TRANSACTIONS ON POWER SYSTEMS, 2018, 33 (03) :2763-2777