Reinforcement Learning Based Multi-objective Eco-driving Strategy in Urban Scenarios

Cited by: 0
Authors
Li J. [1 ]
Wu X. [1 ]
Xu M. [1 ]
Liu Y. [2 ]
Affiliations
[1] School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai
[2] State Key Laboratory of Mechanical Transmission, Chongqing University, Chongqing
Source
Qiche Gongcheng/Automotive Engineering | 2023, Vol. 45, No. 10
Keywords
connected and automated vehicle; deep reinforcement learning; eco-driving; multi-objective optimization; urban traffic scenario
DOI
10.19562/j.chinasae.qcgc.2023.10.002
Abstract
To improve the ride experience of connected and automated vehicles in complex urban traffic scenarios, this paper proposes a deep reinforcement learning based multi-objective eco-driving strategy that considers driving safety, energy economy, ride comfort, and travel efficiency. Firstly, the state space, action space, and multi-objective reward function of the eco-driving strategy are constructed based on a Markov decision process. Secondly, a car-following safety model and a traffic-light safety model are designed to provide safe speed suggestions for the eco-driving strategy. Thirdly, a composite multi-objective reward function design method that integrates safety constraints and shaping functions is proposed to ensure the training convergence and optimization performance of the deep reinforcement learning agent. Finally, the effectiveness of the proposed method is verified through hardware-in-the-loop experiments. The results show that the proposed strategy can run in real time on the onboard vehicle control unit. Compared with an eco-driving strategy based on the intelligent driver model, the proposed strategy improves the energy economy, ride comfort, and travel efficiency of the vehicle while satisfying the driving safety constraints. © 2023 SAE-China. All rights reserved.
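To illustrate how a composite multi-objective reward of the kind described in the abstract might be assembled, the following Python sketch combines energy, comfort, and efficiency terms with a hard safety penalty and a speed-tracking shaping bonus. All signal names (speed, accel, fuel_rate, gap, safe_gap, speed_ref) and weight values are assumptions for illustration only; they are not the paper's actual formulation.

```python
from dataclasses import dataclass

@dataclass
class StepSignals:
    """Per-step signals the agent observes (names are illustrative, not the paper's)."""
    speed: float      # ego vehicle speed [m/s]
    accel: float      # commanded longitudinal acceleration [m/s^2]
    fuel_rate: float  # instantaneous energy/fuel consumption rate
    gap: float        # distance to preceding vehicle or stop line [m]
    safe_gap: float   # minimum safe gap from a car-following / traffic-light safety model [m]
    speed_ref: float  # suggested safe speed from the safety models [m/s]


def composite_reward(s: StepSignals,
                     w_energy: float = 1.0,
                     w_comfort: float = 0.5,
                     w_efficiency: float = 0.5,
                     w_safety: float = 10.0) -> float:
    """Weighted multi-objective reward: energy economy, ride comfort,
    travel efficiency, a safety-constraint penalty, and a shaping term."""
    r_energy = -w_energy * s.fuel_rate         # penalize energy use
    r_comfort = -w_comfort * s.accel ** 2      # penalize harsh acceleration/braking
    r_efficiency = w_efficiency * s.speed      # reward forward progress
    # Safety constraint: large penalty whenever the ego violates the safe gap.
    r_safety = -w_safety if s.gap < s.safe_gap else 0.0
    # Shaping term: small bonus for tracking the suggested safe speed,
    # which helps the DRL agent converge during early training.
    r_shaping = -0.1 * abs(s.speed - s.speed_ref)
    return r_energy + r_comfort + r_efficiency + r_safety + r_shaping
```

In practice the relative weights would be tuned against the vehicle model and traffic scenario; the point of the sketch is only to show how safety constraints and shaping functions can coexist with the economy, comfort, and efficiency objectives in a single scalar reward.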
Pages: 1791-1802
Page count: 11