Hierarchical Energy Management Recognizing Powertrain Dynamics for Electrified Vehicles With Deep Reinforcement Learning and Transfer Learning

Cited by: 0
Authors
Wang, Hao [1 ]
Biswas, Atriya [2 ]
Ahmed, Ryan [1 ]
Yan, Fengjun [1 ]
Emadi, Ali [1 ]
Affiliations
[1] McMaster Univ, McMaster Automot Resource Ctr, Hamilton, ON L8P 0A6, Canada
[2] Indian Inst Technol Madras (IIT Madras), Chennai 600036, Tamil Nadu, India
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords
Mechanical power transmission; Torque; Energy management; Hybrid electric vehicles; Vehicle dynamics; Engines; Biological system modeling; Deep reinforcement learning (DRL); electrified vehicle; energy management strategy (EMS); powertrain dynamics; transfer learning; HYBRID; STRATEGY;
DOI
10.1109/TTE.2024.3442689
CLC Classification Number
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject Classification Code
0808; 0809;
Abstract
Deep reinforcement learning (DRL)-based energy management strategies (EMSs) have gained significant popularity for improving the performance of electrified vehicles. Typically, these EMSs are trained and validated in simulated environments. However, this article reveals that the fidelity of the environment significantly impacts the performance of DRL-based EMSs. Specifically, EMSs optimized within low-fidelity environments (LFEs), which are prevalent in the literature yet lack detailed powertrain dynamics, suffer a 2%-3% drop in energy economy when tested in high-fidelity environments (HFEs) that incorporate powertrain dynamics. To address this gap, a DRL-based hierarchical energy management framework for multimode power-split hybrid electric vehicles (HEVs) is proposed. The framework recognizes powertrain dynamics and employs transfer learning to bridge the performance gap between LFEs and HFEs. In the upper level, a DRL agent determines the optimal timing to activate the hybrid mode and the optimal engine operation. The lower level optimizes the torque distribution between the two electric motors in all-electric modes. Simulation results demonstrate that the proposed DRL-based EMS, enhanced by transfer learning, reduces training time by approximately 40% compared with an EMS trained from scratch within an HFE. Moreover, the proposed EMS achieves 98% of the energy economy of the optimal benchmark, addressing the noted performance degradation and exhibiting consistent performance in adaptability tests.
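The abstract outlines a two-level architecture combined with LFE-to-HFE transfer learning. The following is a minimal, hypothetical Python sketch of that structure only; the class names, state and action definitions, torque-split rule, and training loop are illustrative assumptions and do not reflect the paper's actual DRL algorithm, reward design, or vehicle model.

```python
# Structural sketch of a hierarchical EMS with transfer learning.
# Everything here (names, dimensions, rules) is a hypothetical placeholder.
import numpy as np


class PolicyNet:
    """Tiny stand-in for a DRL actor network; a real agent would use a
    deep-learning framework and a proper actor-critic algorithm."""

    def __init__(self, state_dim: int, action_dim: int, hidden: int = 32):
        rng = np.random.default_rng(0)
        self.w1 = rng.normal(scale=0.1, size=(state_dim, hidden))
        self.w2 = rng.normal(scale=0.1, size=(hidden, action_dim))

    def act(self, state: np.ndarray) -> np.ndarray:
        h = np.tanh(state @ self.w1)
        return np.tanh(h @ self.w2)  # normalized actions in [-1, 1]

    def get_weights(self):
        return [self.w1.copy(), self.w2.copy()]

    def set_weights(self, weights):
        self.w1, self.w2 = [w.copy() for w in weights]


def upper_level_action(policy: PolicyNet, state: np.ndarray):
    """Upper level: decide whether to activate the hybrid mode and, if so,
    an engine operating command (a torque command here, for illustration)."""
    a = policy.act(state)
    activate_hybrid = a[0] > 0.0            # hypothetical mode-switch rule
    engine_torque_cmd = 0.5 * (a[1] + 1.0)  # scaled to [0, 1] of max torque
    return activate_hybrid, engine_torque_cmd


def lower_level_torque_split(demand_torque: float, split_ratio: float):
    """Lower level: distribute the demanded torque between the two electric
    motors in all-electric modes (a simple ratio rule for illustration)."""
    return split_ratio * demand_torque, (1.0 - split_ratio) * demand_torque


def train(policy: PolicyNet, env_name: str, episodes: int) -> PolicyNet:
    """Placeholder training loop; a real agent would interact with the
    low- or high-fidelity powertrain simulation and update its weights."""
    print(f"training {episodes} episodes in {env_name}")
    return policy


if __name__ == "__main__":
    # 1) Pre-train the upper-level agent in the low-fidelity environment.
    lfe_policy = train(PolicyNet(state_dim=6, action_dim=2), "LFE", episodes=500)

    # 2) Transfer learning: initialize the HFE agent with the LFE weights,
    #    then fine-tune for fewer episodes (the abstract reports roughly 40%
    #    less training time than training from scratch in the HFE).
    hfe_policy = PolicyNet(state_dim=6, action_dim=2)
    hfe_policy.set_weights(lfe_policy.get_weights())
    hfe_policy = train(hfe_policy, "HFE", episodes=300)

    # 3) Example decision at one time step (hypothetical state features).
    state = np.array([0.6, 0.2, 0.0, 55.0, 0.1, 0.0])
    hybrid_on, engine_cmd = upper_level_action(hfe_policy, state)
    if hybrid_on:
        print("Hybrid mode, engine torque command:", engine_cmd)
    else:
        print("EV mode torque split:", lower_level_torque_split(120.0, 0.6))
```

In this sketch the transfer step is simply a weight copy followed by shortened fine-tuning in the high-fidelity environment, which is one common way such LFE-to-HFE transfer is realized; the paper's specific transfer mechanism may differ.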
Pages: 3466-3479
Page count: 14