An Energy-Efficient Driving Method for Connected and Automated Vehicles Based on Reinforcement Learning

Times Cited: 3
Authors
Min, Haitao [1 ]
Xiong, Xiaoyong [1 ]
Yang, Fang [2 ]
Sun, Weiyi [1 ]
Yu, Yuanbin [1 ]
Wang, Pengyu [1 ]
Affiliations
[1] Jilin Univ, State Key Lab Automot Simulat & Control, Changchun 130012, Peoples R China
[2] China FAW Corp Ltd, Gen Res & Dev Inst, Changchun 130013, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
connected and automated vehicles; energy-efficient driving; reinforcement learning; long short-term memory; proximal policy optimization; MODEL-PREDICTIVE CONTROL; ELECTRIC VEHICLES;
DOI
10.3390/machines11020168
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
The development of connected and automated vehicle (CAV) technology not only helps to reduce traffic accidents and improve traffic efficiency, but also has significant potential for energy saving and emission reduction. Using dynamic traffic-flow information around the vehicle to optimize its trajectory helps improve its energy efficiency. Therefore, an energy-efficient driving method for CAVs based on reinforcement learning is proposed in this paper. First, a set of vehicle trajectory prediction models based on long short-term memory (LSTM) neural networks is developed, integrating driving-intention prediction and lane-change-time prediction to improve the prediction accuracy of surrounding vehicle trajectories. Second, an energy-efficient driving model is built based on Proximal Policy Optimization (PPO) reinforcement learning. The model takes the current states and predicted trajectories of surrounding vehicles as input and outputs energy-saving control variables while satisfying constraints on safety, comfort, and travel efficiency. Finally, the method is tested in simulation on the NGSIM dataset, and the results show that it reduces energy consumption by 9-22%.
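A minimal sketch, not the authors' implementation, of how the two components described in the abstract could fit together: a PyTorch LSTM that forecasts surrounding-vehicle trajectories, and those forecasts concatenated with the ego state to form the observation fed to a PPO policy. Layer sizes, input features, and the history/forecast horizons are illustrative assumptions, and the driving-intention and lane-change-time sub-models are omitted.

import torch
import torch.nn as nn

class TrajectoryPredictor(nn.Module):
    """Encode a surrounding vehicle's recent motion and predict its future (x, y) positions."""
    def __init__(self, n_features=4, hidden=64, horizon=50):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2 * horizon)          # flattened (x, y) forecast

    def forward(self, history):                              # history: (n_vehicles, T, n_features)
        _, (h, _) = self.encoder(history)
        return self.head(h[-1]).view(-1, self.horizon, 2)    # (n_vehicles, horizon, 2)

predictor = TrajectoryPredictor()
history = torch.randn(2, 30, 4)       # 2 surrounding vehicles, 30 past steps of (x, y, speed, heading)
preds = predictor(history)            # (2, 50, 2) predicted positions

ego_state = torch.randn(1, 6)         # assumed ego features, e.g. position, speed, acceleration
obs = torch.cat([ego_state, preds.reshape(1, -1)], dim=1)    # observation vector for the PPO policy

In this sketch, a standard PPO actor-critic would receive obs at each control step and output the ego acceleration command, with a reward combining energy use and the safety, comfort, and travel-efficiency constraints mentioned in the abstract.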
Pages: 20