Neural Network and Reinforcement Learning based Energy Management Strategy for Battery/Supercapacitor HEV

Times Cited: 0
Authors
Tao, Jili [1 ]
Xu, Zejiang [2 ]
Ma, Longhua [1 ]
Tian, Guanzhong [1 ]
Wu, Chengyu [1 ]
Affiliations
[1] NingboTech Univ, Sch Informat Sci & Engn, Ningbo, Peoples R China
[2] Zhejiang Sci Tech Univ, Sch Informat Sci & Technol, Hangzhou, Peoples R China
Source
2024 IEEE 19TH CONFERENCE ON INDUSTRIAL ELECTRONICS AND APPLICATIONS, ICIEA 2024 | 2024
Funding
National Natural Science Foundation of China;
Keywords
Energy management strategy; neural network; reinforcement learning; model predictive control;
DOI
10.1109/ICIEA61579.2024.10665301
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
A novel energy management strategy based on a neural network (NN) and reinforcement learning is proposed to split power between the lithium battery and the supercapacitor of a hybrid energy vehicle (HEV), meeting the power demanded by vehicle driving while improving battery lifetime. First, an NN state-action model is learned from data collected with a traditional energy management controller; the model then generates new data that are combined with the original data to further refine the NN model. Next, action sequences are generated randomly and evaluated by the NN model, and only the first action of the best sequence is applied. After this action is executed, the next action is re-predicted, which reduces the cumulative error caused by model inaccuracy. Finally, four typical driving patterns are selected and the proposed strategy is compared with the Deep Q-Network (DQN) method, verifying the performance of the NN- and reinforcement-learning-based energy management strategy.
Pages: 5
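The receding-horizon procedure described in the abstract (randomly sampled action sequences scored by the learned NN model, with only the first action applied before re-planning) resembles random-shooting model predictive control. The Python sketch below illustrates that loop under stated assumptions: the model interface nn_model, the action bounds, the state layout, and the stage cost are illustrative placeholders, not the paper's actual implementation.

```python
import numpy as np

# Hypothetical learned state-action model: predicts the next state and a
# one-step cost for a candidate power-split action. In the paper this role
# is played by the trained NN model; here it is only a stub.
def nn_model(state, action):
    next_state = state + 0.01 * action      # placeholder dynamics
    cost = float(np.sum(action ** 2))       # placeholder stage cost
    return next_state, cost

def plan_first_action(state, horizon=10, n_sequences=200, action_dim=1,
                      action_low=-1.0, action_high=1.0, rng=None):
    """Random-shooting step: sample action sequences, score each with the
    NN model, and return only the first action of the best sequence."""
    if rng is None:
        rng = np.random.default_rng()
    best_cost, best_first_action = np.inf, None
    for _ in range(n_sequences):
        seq = rng.uniform(action_low, action_high, size=(horizon, action_dim))
        s, total_cost = state, 0.0
        for a in seq:                        # roll the sequence through the model
            s, c = nn_model(s, a)
            total_cost += c
        if total_cost < best_cost:
            best_cost, best_first_action = total_cost, seq[0]
    return best_first_action

# Receding-horizon use: apply one action, observe the system, re-plan,
# which limits the cumulative error of an inaccurate model.
state = np.zeros(2)                          # e.g., [battery SOC, supercapacitor SOC]
for step in range(5):
    action = plan_first_action(state)
    state, _ = nn_model(state, action)       # stand-in for the real vehicle response
```

Re-planning at every step is what keeps model error from compounding over the full horizon, at the cost of repeating the sampling-and-evaluation procedure online.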