Energy Management for a Hybrid Electric Vehicle Based on Blended Reinforcement Learning With Backward Focusing and Prioritized Sweeping

Cited by: 51
Authors
Yang, Ningkang [1 ,2 ]
Han, Lijin [1 ,2 ]
Xiang, Changle [1 ,2 ]
Liu, Hui [1 ,2 ]
Hou, Xuzhao [1 ,2 ]
Affiliations
[1] Beijing Inst Technol, Sch Mech Engn, Beijing 100081, Peoples R China
[2] Beijing Inst Technol, Natl Key Lab Vehicular Transmiss, Beijing 100081, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
State of charge; Batteries; Engines; Energy management; Hybrid electric vehicles; Resistance; Mechanical power transmission; Hybrid electric vehicle; energy management; blended reinforcement learning; queue-Dyna; STRATEGY;
DOI
10.1109/TVT.2021.3064407
Chinese Library Classification (CLC)
TM [Electrical engineering]; TN [Electronics and communication technology];
Discipline classification codes
0808; 0809;
Abstract
As a fundamental task for hybrid electric vehicles (HEVs), energy management is critical to improving overall vehicle performance. This paper proposes an energy management strategy (EMS) based on a new queue-Dyna reinforcement learning (RL) algorithm, which substantially reduces the online learning time while guaranteeing control performance comparable to that of the widely used Q-learning. To address the limitations of existing direct and indirect RL based EMSs, a blended RL algorithm, Dyna, is first introduced. By reusing actual experience to construct a model online, Dyna integrates direct and indirect RL and thus combines the advantages of both. Furthermore, two strategies, backward focusing and prioritized sweeping, are incorporated into the Dyna framework to develop the queue-Dyna algorithm. To the best of our knowledge, this is the first attempt to apply queue-Dyna to the EMS of an HEV. A comparative simulation of direct RL, indirect RL, Dyna, and queue-Dyna is carried out, and the results demonstrate that the proposed algorithm learns substantially faster while maintaining satisfactory fuel consumption. Finally, a hardware-in-the-loop experiment verifies the real-time performance of the proposed EMS.
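The core mechanism the abstract describes, Dyna with backward focusing and prioritized sweeping, can be sketched in a few dozen lines. The sketch below is a generic tabular illustration under stated assumptions, not the paper's HEV implementation: `queue_dyna_sketch` and the deterministic `env_step(s, a) -> (reward, next_state)` interface are hypothetical names, and the hyperparameters are illustrative defaults rather than values from the paper.

```python
import heapq
import random
from collections import defaultdict

def queue_dyna_sketch(env_step, states, actions, episodes=50,
                      alpha=0.1, gamma=0.95, epsilon=0.1,
                      theta=1e-4, n_planning=10):
    """Illustrative Dyna-Q with prioritized sweeping and backward focusing.

    env_step(s, a) -> (reward, next_state) is an assumed deterministic
    environment interface; the paper's HEV powertrain model is not
    reproduced here.
    """
    Q = defaultdict(float)           # action-value table Q[(s, a)]
    model = {}                       # learned model: (s, a) -> (r, s')
    predecessors = defaultdict(set)  # s' -> set of (s, a) observed to reach it
    pqueue = []                      # max-priority queue via negated priority

    for _ in range(episodes):
        s = random.choice(states)
        for _ in range(100):         # cap episode length
            # Direct RL part: epsilon-greedy interaction with the environment
            if random.random() < epsilon:
                a = random.choice(actions)
            else:
                a = max(actions, key=lambda x: Q[(s, x)])
            r, s2 = env_step(s, a)
            # Indirect RL part: learn the model online from actual experience
            model[(s, a)] = (r, s2)
            predecessors[s2].add((s, a))
            # Priority = magnitude of the temporal-difference error
            p = abs(r + gamma * max(Q[(s2, x)] for x in actions) - Q[(s, a)])
            if p > theta:
                heapq.heappush(pqueue, (-p, (s, a)))
            # Planning: sweep the highest-priority updates first
            for _ in range(n_planning):
                if not pqueue:
                    break
                _, (ps, pa) = heapq.heappop(pqueue)
                pr, ps2 = model[(ps, pa)]
                Q[(ps, pa)] += alpha * (
                    pr + gamma * max(Q[(ps2, x)] for x in actions) - Q[(ps, pa)])
                # Backward focusing: re-prioritize predecessors of the
                # state whose value just changed
                for (bs, ba) in predecessors[ps]:
                    br, _ = model[(bs, ba)]
                    bp = abs(br + gamma * max(Q[(ps, x)] for x in actions)
                             - Q[(bs, ba)])
                    if bp > theta:
                        heapq.heappush(pqueue, (-bp, (bs, ba)))
            s = s2
    return Q
```

The priority queue is what distinguishes queue-Dyna from plain Dyna: instead of replaying random model transitions, planning effort is concentrated on state-action pairs whose values are most out of date, and each value change is propagated backward to its predecessors, which is what enables the fast learning the abstract reports.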
Pages: 3136-3148
Number of pages: 13