Deep Reinforcement Learning Based Optimal Route and Charging Station Selection

Cited by: 39
Authors
Lee, Ki-Beom [1 ]
Ahmed, Mohamed A. [2 ,3 ]
Kang, Dong-Ki [1 ]
Kim, Young-Chon [1 ]
Affiliations
[1] Jeonbuk Natl Univ, Dept Comp Engn, Div Elect & Informat, Jeonju 54896, South Korea
[2] Univ Tecn Federico Santa Maria, Dept Elect Engn, Valparaiso 2390123, Chile
[3] Higher Inst Engn & Technol King Marriott, Dept Commun & Elect, Alexandria 23713, Egypt
Keywords
electric vehicle; electric vehicle charging station; intelligent transport system; electric vehicle charging navigation system; Markov decision process; deep reinforcement learning; navigation
DOI
10.3390/en13236255
Chinese Library Classification
TE (Petroleum and Natural Gas Industry); TK (Energy and Power Engineering)
Discipline codes
0807; 0820
Abstract
This paper proposes an optimal route and charging station selection (RCS) algorithm based on model-free deep reinforcement learning (DRL) to overcome the uncertainty of traffic conditions and dynamically arriving charging requests. The proposed DRL-based RCS algorithm aims to minimize the total travel time of electric vehicle (EV) charging requests from origin to destination by selecting the optimal route and charging station, considering dynamically changing traffic conditions and unknown future requests. We formulate the RCS problem as a Markov decision process with unknown transition probabilities, and adopt a deep Q-network (DQN) with function approximation to find the optimal EV charging station (EVCS) selection policy. To obtain the feature state of each EVCS, we define a traffic preprocessing module, a charging preprocessing module, and a feature extraction module. The proposed DRL-based RCS algorithm is compared with conventional strategies such as minimum distance, minimum travel time, and minimum waiting time. Performance is evaluated in terms of travel time, waiting time, charging time, driving time, and distance under various distributions and numbers of EV charging requests.
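As a rough illustration of the model-free selection idea described in the abstract (learning which EVCS minimizes travel time purely from observed outcomes, without a traffic model), the sketch below trains a linear Q-approximation with an epsilon-greedy policy. It is an assumption-laden toy, not the paper's method: the paper uses a deep Q-network over preprocessed traffic and charging features, whereas here the feature vectors, the travel-time simulator (`travel_time`, `TRUE_W`), and all hyperparameters are hypothetical, and the sequential MDP is collapsed to a one-shot station choice per request.

```python
import numpy as np

rng = np.random.default_rng(0)

N_EVCS = 3       # candidate charging stations (hypothetical)
N_FEATURES = 4   # e.g. distance, traffic level, queue length, charging rate

# Linear Q-approximation: Q(s, a) = w[a] @ feats[a]
w = np.zeros((N_EVCS, N_FEATURES))

TRUE_W = np.array([1.0, 2.0, 1.5, -0.5])  # hidden cost model of the toy simulator

def travel_time(feats, action):
    """Toy stand-in for a traffic/charging simulator: observed total
    travel time for choosing `action`, with small noise."""
    return feats[action] @ TRUE_W + rng.normal(0.0, 0.1)

alpha, eps = 0.05, 0.1  # learning rate and exploration rate (illustrative)

for episode in range(3000):
    # One feature vector per candidate EVCS for this charging request
    feats = rng.uniform(0.0, 1.0, size=(N_EVCS, N_FEATURES))
    q = np.array([w[a] @ feats[a] for a in range(N_EVCS)])  # predicted times
    # epsilon-greedy: explore, else pick the station with minimum predicted time
    a = int(rng.integers(N_EVCS)) if rng.random() < eps else int(np.argmin(q))
    observed = travel_time(feats, a)
    # One-step TD/LMS update toward the observed travel time (the cost signal)
    w[a] += alpha * (observed - w[a] @ feats[a]) * feats[a]
```

In the paper's setting, the linear map would be replaced by a DQN trained on states produced by the traffic and charging preprocessing modules, and an episode would span the full origin-to-destination trip rather than a single selection.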
Pages: 22
References
31 records
[1] Cao Y.C., 2019, P 2019 1 INT C IND A, V32, P183, DOI 10.1017/S0954422419000052
[2] Cao Y., 2017, INT WIREL COMMUN, P1471, DOI 10.1109/IWCMC.2017.7986501
[3] Cerna F.V., Pourakbari-Kasmaei M., Romero R.A., Rider M.J. Optimal Delivery Scheduling and Charging of EVs in the Navigation of a City Map. IEEE Transactions on Smart Grid, 2018, 9(5): 4815-4827
[4] Eklund P.W., Kirkby S., Pollitt S. A dynamic multi source Dijkstra's algorithm for vehicle routing. ANZIIS 96 - 1996 Australian New Zealand Conference on Intelligent Information Systems, Proceedings, 1996: 329-333
[5] Ghosh A. Possibilities and Challenges for the Inclusion of the Electric Vehicle (EV) to Reduce the Carbon Footprint in the Transport Sector: A Review. Energies, 2020, 13(10)
[6] Guo Q., Xin S., Sun H., Li Z., Zhang B. Rapid-Charging Navigation of Electric Vehicles Based on Real-Time Power Systems and Traffic Data. IEEE Transactions on Smart Grid, 2014, 5(4): 1969-1979
[7] Jin C., Tang J., Ghosh P. Optimizing Electric Vehicle Charging: A Customer's Perspective. IEEE Transactions on Vehicular Technology, 2013, 62(7): 2919-2927
[8] Kim S., Lim H. Reinforcement Learning Based Energy Management Algorithm for Smart Energy Buildings. Energies, 2018, 11(8)
[9] Lee S., Choi D.-H. Reinforcement Learning-Based Energy Management of Smart Home with Rooftop Solar Photovoltaic System, Energy Storage System, and Home Appliances. Sensors, 2019, 19(18)
[10] Lee W., Schober R., Wong V.W.S. An Analysis of Price Competition in Heterogeneous Electric Vehicle Charging Stations. IEEE Transactions on Smart Grid, 2019, 10(4): 3990-4002