Power output optimization of electric vehicles smart charging hubs using deep reinforcement learning

Cited by: 21
Authors
Bertolini, Andrea [1]
Martins, Miguel S. E. [1]
Vieira, Susana M. [1]
Sousa, Joao M. C. [1]
Affiliations
[1] Univ Lisbon, IDMEC, Inst Super Tecn, 1 Ave Rovisco Pais, P-1049001 Lisbon, Portugal
Keywords
Reinforcement learning; Electric vehicles; Real-time charging scheduling; Neural network; Clustering algorithm;
DOI
10.1016/j.eswa.2022.116995
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Since most branches of the distribution grid may already be close to their maximum capacity, smart management of electric vehicle (EV) charging is becoming increasingly crucial. In fact, office buildings might not be able to handle several charging transactions at the same time, especially with the next generation of fast chargers, which draw considerable power. Thus, an efficient charging policy is needed. This paper proposes real-time scheduling of EV charging through deep reinforcement learning (DRL) techniques. DRL was chosen because it can adaptively learn by interacting with the surrounding environment. The optimization aims to complete charging transactions in a timely manner while shifting the load away from times of peak demand. The novelty of the proposed approach lies in its framework: pools of electric vehicles with different characteristics are categorized using a clustering algorithm, a tree-based classifier sorts new EV instances into those categories, and a multilayer perceptron deep neural network predicts the expected duration of each charging session. These features are used as inputs to the DRL agent and are mapped into actions that adjust the maximum power associated with each charging station. The model was compared to a traditional charging algorithm under increasingly challenging scenarios. Results show that the developed algorithm fails less often than the baseline, reducing the EV charging load by 80% during peak times.
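The feature pipeline described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the feature set (arrival hour, requested energy), the synthetic data, and all parameter choices are hypothetical, standing in for the paper's clustering algorithm, tree-based classifier, and MLP duration predictor whose outputs feed the DRL agent.

```python
# Hypothetical sketch of the framework's feature pipeline (synthetic data and
# all names/parameters are assumptions): EV sessions are clustered into pools,
# a tree classifier sorts new arrivals into those pools, and an MLP predicts
# the expected charging duration. The resulting features would then be passed
# to the DRL agent that sets the maximum power of each charging station.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Synthetic EV sessions: [arrival hour, requested energy in kWh]
X = np.column_stack([rng.uniform(0, 24, 200), rng.uniform(5, 60, 200)])
# Synthetic session durations in hours, loosely tied to requested energy
y = X[:, 1] / 11.0 + rng.normal(0.0, 0.2, 200)

# 1) Cluster historical sessions into pools of EVs with similar behavior
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

# 2) Train a tree-based classifier to sort new EV instances into the pools
clf = DecisionTreeClassifier(random_state=0).fit(X, kmeans.labels_)

# 3) Train an MLP to predict the expected duration of a charging session
mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                   random_state=0).fit(X, y)

# Features for one new arrival, as would be fed to the DRL agent
new_ev = np.array([[18.0, 40.0]])          # 6 pm arrival, 40 kWh requested
pool = int(clf.predict(new_ev)[0])         # pool label from the classifier
expected_hours = float(mlp.predict(new_ev)[0])  # predicted session duration
print(pool, round(expected_hours, 2))
```

In the paper's setting, the pool label and predicted duration form part of the state observed by the DRL agent, whose action adjusts each station's power cap to shift load away from peak demand.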
Pages: 11
References
38 in total
[1]  
Al Zishan Abdullah, 2020, e-Energy '20: Proceedings of the Eleventh ACM International Conference on Future Energy Systems, P116, DOI 10.1145/3396851.3397706
[2]  
Open Charge Alliance, 2015, Open Charge Point Protocol 1.6
[3]   A review of EVs charging: From the perspective of energy optimization, optimization approaches, and charging techniques [J].
Amjad, Muhammad ;
Ahmad, Ayaz ;
Rehmani, Mubashir Husain ;
Umer, Tariq .
TRANSPORTATION RESEARCH PART D-TRANSPORT AND ENVIRONMENT, 2018, 62 :386-417
[4]  
[Anonymous], 2019, Innovation Outlook. Smart Charging for Electric vehicles
[5]   A Q-Learning Based Charging Scheduling Scheme for Electric Vehicles [J].
Dang, Qiyun ;
Wu, Di ;
Boulet, Benoit .
2019 IEEE TRANSPORTATION ELECTRIFICATION CONFERENCE AND EXPO (ITEC), 2019,
[6]   Optimal Electric Vehicle Charging Strategy With Markov Decision Process and Reinforcement Learning Technique [J].
Ding, Tao ;
Zeng, Ziyu ;
Bai, Jiawen ;
Qin, Boyu ;
Yang, Yongheng ;
Shahidehpour, Mohammad .
IEEE TRANSACTIONS ON INDUSTRY APPLICATIONS, 2020, 56 (05) :5811-5823
[7]   Genetic Algorithm for Optimal Charge Scheduling of Electric Vehicle Fleet [J].
Elmehdi, Mabrouk ;
Abdelilah, Maach .
PROCEEDINGS OF THE 2ND INTERNATIONAL CONFERENCE ON NETWORKING, INFORMATION SYSTEMS & SECURITY (NISS19), 2019,
[8]   Multi-Agent Reinforcement Learning Approach for Residential Microgrid Energy Scheduling [J].
Fang, Xiaohan ;
Wang, Jinkuan ;
Song, Guanru ;
Han, Yinghua ;
Zhao, Qiang ;
Cao, Zhiao .
ENERGIES, 2020, 13 (01)
[9]  
Geron A., 2017, Hands-On Machine Learning with Scikit-Learn, 1st edn.
[10]  
Goodfellow I., 2016, Deep Learning (Adaptive Computation and Machine Learning series), P1