A deep reinforcement learning based charging and discharging scheduling strategy for electric vehicles

Cited by: 1
Authors
Xiao, Qin [1 ]
Zhang, Runtao [1 ]
Wang, Yongcan [2 ,3 ]
Shi, Peng [2 ,3 ]
Wang, Xi [2 ,3 ]
Chen, Baorui [2 ,3 ]
Fan, Chengwei [2 ,3 ]
Chen, Gang [2 ,3 ]
Affiliations
[1] Southwest Jiaotong Univ, Sch Elect Engn, Chengdu, Peoples R China
[2] State Grid Sichuan Elect Power Co, Elect Power Sci Res Inst, Chengdu, Peoples R China
[3] Power Internet Things Key Lab Sichuan Prov, Chengdu, Peoples R China
Keywords
Electric vehicles; Markov decision process; Deep reinforcement learning; Soft actor-critic; Charging and discharging scheduling; COORDINATION;
DOI
10.1016/j.egyr.2024.10.056
Chinese Library Classification
TE [Petroleum and natural gas industry]; TK [Energy and power engineering];
Discipline classification codes
0807; 0820;
Abstract
Grid security is threatened by the uncontrolled access of large-scale electric vehicles (EVs) to the grid. The EV charging and discharging scheduling problem is formulated as a Markov decision process (MDP) in order to develop an efficient scheduling strategy, and a model-free method based on deep reinforcement learning (DRL) is proposed to solve it. The proposed method aims to enhance EV charging and discharging profits while reducing drivers' electricity anxiety and ensuring grid security. Drivers' electricity anxiety is modeled with fuzzy mathematical theory, and the effects of the EV's current state of charge and remaining charging time on that anxiety are analyzed. A variable electricity price is computed from the real-time residential load. A dynamic charging environment is constructed that accounts for the stochasticity of electricity prices, driver behavior, and residential load. A soft actor-critic (SAC) framework is used to train the agent, which learns the optimal charging and discharging scheduling strategy by interacting with this dynamic environment. Finally, simulations with real-world data verify that the proposed approach reduces drivers' charging costs and electricity anxiety while avoiding transformer overload.
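The abstract's MDP can be illustrated with a minimal environment sketch: the state holds the EV's state of charge, the remaining plug-in hours, and the current price; the action is a signed charging rate; and the reward combines electricity cost, a transformer-overload penalty, and a fuzzy-style anxiety term. All names, membership shapes, and penalty weights below are illustrative assumptions, not taken from the paper, and the SAC training loop itself is omitted.

```python
def electricity_anxiety(soc: float, hours_left: float,
                        soc_target: float = 0.9, horizon_h: float = 8.0) -> float:
    """Hypothetical fuzzy-style anxiety membership in [0, 1].

    Anxiety rises with the state-of-charge deficit and with urgency
    (little remaining charging time); the two memberships are combined
    multiplicatively. The exact membership functions in the paper differ.
    """
    deficit = max(0.0, soc_target - soc) / soc_target            # in [0, 1]
    urgency = 1.0 - min(max(hours_left, 0.0), horizon_h) / horizon_h
    return deficit * (0.5 + 0.5 * urgency)   # some anxiety even with time left


class EVChargingEnv:
    """Minimal MDP sketch with hourly steps.

    State: (soc, hours_left, price).  Action: a in [-1, 1], mapped to
    discharge (-) / charge (+) power as a fraction of the charger limit.
    """

    def __init__(self, prices, capacity_kwh=40.0, p_max_kw=7.0,
                 feeder_limit_kw=10.0, base_load_kw=None):
        self.prices = list(prices)                     # $/kWh for each hour
        self.capacity = capacity_kwh
        self.p_max = p_max_kw
        self.feeder_limit = feeder_limit_kw
        self.base_load = base_load_kw or [4.0] * len(self.prices)

    def reset(self, soc0=0.3):
        self.t = 0
        self.soc = soc0
        return (self.soc, len(self.prices), self.prices[0])

    def step(self, action):
        a = max(-1.0, min(1.0, action))
        power = a * self.p_max                         # kW; 1 h step => kWh
        # Clip the energy exchange so the SoC stays in [0, 1]
        energy = max(-self.soc * self.capacity,
                     min(power, (1.0 - self.soc) * self.capacity))
        self.soc += energy / self.capacity
        cost = self.prices[self.t] * energy            # negative => V2G revenue
        overload = max(0.0, self.base_load[self.t] + max(energy, 0.0)
                       - self.feeder_limit)
        anx = electricity_anxiety(self.soc, len(self.prices) - self.t - 1)
        reward = -cost - 5.0 * overload - 2.0 * anx    # weights are illustrative
        self.t += 1
        done = self.t >= len(self.prices)
        obs = None if done else (self.soc, len(self.prices) - self.t,
                                 self.prices[self.t])
        return obs, reward, done
```

A SAC agent would replace the hand-picked action with a stochastic policy trained on `(obs, action, reward, next_obs)` transitions collected from `step`; the reward shaping above mirrors the three objectives named in the abstract (profit, anxiety, grid security).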
Pages: 4854-4863
Number of pages: 10
Related references (43 total)
  • [1] Distributed Electric Vehicles Charging Management Considering Time Anxiety and Customer Behaviors
    Alsabbagh, Amro
    Wu, Brian
    Ma, Chengbin
    [J]. IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2021, 17 (04) : 2422 - 2431
  • [2] Pathway toward carbon-neutral electrical systems in China by mid-century with negative CO2 abatement costs informed by high-resolution modeling
    Chen, Xinyu
    Liu, Yaxing
    Wang, Qin
    Lv, Jiajun
    Wen, Jinyu
    Chen, Xia
    Kang, Chongqing
    Cheng, Shijie
    McElroy, Michael B.
    [J]. JOULE, 2021, 5 (10) : 2715 - 2741
  • [3] A Multiagent Federated Reinforcement Learning Approach for Plug-In Electric Vehicle Fleet Charging Coordination in a Residential Community
    Chu, Yunfei
    Wei, Zhinong
    Fang, Xicheng
    Chen, Sheng
    Zhou, Yizhou
    [J]. IEEE ACCESS, 2022, 10 : 98535 - 98548
  • [4] Real-time multi-objective optimisation for electric vehicle charging management
    Das, Ridoy
    Wang, Yue
    Busawon, Krishna
    Putrus, Ghanim
    Neaimeh, Myriam
    [J]. JOURNAL OF CLEANER PRODUCTION, 2021, 292
  • [5] Optimal Electric Vehicle Charging Strategy With Markov Decision Process and Reinforcement Learning Technique
    Ding, Tao
    Zeng, Ziyu
    Bai, Jiawen
    Qin, Boyu
    Yang, Yongheng
    Shahidehpour, Mohammad
    [J]. IEEE TRANSACTIONS ON INDUSTRY APPLICATIONS, 2020, 56 (05) : 5811 - 5823
  • [6] Fujimoto S., 2018, Addressing Function Approximation Error in Actor-Critic Methods
  • [7] Optimized Scheduling of EV Charging in Solar Parking Lots for Local Peak Reduction under EV Demand Uncertainty
    Ghotge, Rishabh
    Snow, Yitzhak
    Farahani, Samira
    Lukszo, Zofia
    van Wijk, Ad
    [J]. ENERGIES, 2020, 13 (05)
  • [8] GRIDWATCH, About Us
  • [9] Hou ZM, 2020, Arxiv, DOI arXiv:2002.02829
  • [10] A Real-Time EV Charging Scheduling for Parking Lots With PV System and Energy Store System
    Jiang, Wei
    Zhen, Yongqi
    [J]. IEEE ACCESS, 2019, 7 : 86184 - 86193