A multi-head attention mechanism aided hybrid network for identifying batteries' state of charge

Cited: 16
Authors
Li, Zongxiang [1 ]
Li, Liwei [2 ]
Chen, Jing [3 ]
Wang, Dongqing [1 ]
Affiliations
[1] Qingdao Univ, Coll Elect Engn, Qingdao 266071, Peoples R China
[2] Shandong Univ, Sch Control Sci & Engn, Jinan 250061, Peoples R China
[3] Jiangnan Univ, Sch Sci, Wuxi 214122, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Lithium-ion batteries; State of charge; Adaptive noise based complete ensemble empirical mode decomposition (AN-CEEMD); CNN-BiLSTM network; Multi-head attention mechanism; LITHIUM-ION BATTERIES; MACHINE;
DOI
10.1016/j.energy.2023.129504
Chinese Library Classification
O414.1 [Thermodynamics];
Discipline Classification Code
Abstract
A Convolutional Neural Network and Bidirectional Long Short-Term Memory network architecture with a Multi-Head Attention mechanism (CNN-BiLSTM-MHA) is studied for predicting the state of charge (SOC) of lithium-ion batteries (LIBs). Firstly, an adaptive noise based complete ensemble empirical mode decomposition (AN-CEEMD) algorithm is adopted to capture the intrinsic features of measured battery signals by adding white noise. Secondly, a CNN-BiLSTM model with the MHA mechanism is developed to learn the mapping between the processed input signals and battery SOC. It has three parts: 1) the CNN extracts features from the processed data, mines the relations among input signals, and improves estimation precision; 2) the BiLSTM has the memory ability to capture battery dynamics, and its Swish activation function remains unsaturated and reduces over-fitting because it is unbounded above and bounded below; 3) the multi-head attention mechanism uses several independent self-attention layers to associate input information and variables, extracting more associated information through learned weights; it reduces the over-fitting risk of a single attention head and improves the model's generalization through the joint learning of multiple heads. Finally, experiments and simulations under four operating conditions at five different temperatures verify the effectiveness of the presented method.
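The abstract's multi-head attention step and Swish activation can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the projection weights are random stand-ins for learned parameters, and the 10x16 input is a hypothetical window of battery measurement features.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def swish(x):
    """Swish activation x * sigmoid(x): unbounded above, bounded below."""
    return x / (1.0 + np.exp(-x))

def multi_head_attention(x, n_heads, rng):
    """Multi-head self-attention over a (seq_len, d_model) input.

    Each head runs an independent scaled dot-product attention on its
    own slice of the feature dimension; the heads are then concatenated
    and mixed by an output projection.
    """
    seq_len, d_model = x.shape
    assert d_model % n_heads == 0
    d_head = d_model // n_heads
    # Random projections stand in for the learned Q/K/V/output weights.
    w_q, w_k, w_v, w_o = (
        rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
        for _ in range(4)
    )
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    # Split the feature dimension into heads: (n_heads, seq_len, d_head).
    split = lambda t: t.reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    q, k, v = split(q), split(k), split(v)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)  # (n_heads, seq, seq)
    heads = softmax(scores) @ v                          # (n_heads, seq, d_head)
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ w_o

rng = np.random.default_rng(0)
x = rng.standard_normal((10, 16))  # 10 time steps, 16 hypothetical features
out = multi_head_attention(x, n_heads=4, rng=rng)
print(out.shape)  # (10, 16)
```

Because each head attends over its own subspace, a failure of one head to learn a useful pattern is compensated by the others, which is the over-fitting argument the abstract makes for MHA over a single attention head.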
Pages: 13
Related Papers
50 records in total
  • [1] Prediction of Lithium Battery Voltage and State of Charge Using Multi-Head Attention BiLSTM Neural Network
    Xi, Haiwen
    Lv, Taolin
    Qin, Jincheng
    Ma, Mingsheng
    Xie, Jingying
    Lu, Shigang
    Liu, Zhifu
    APPLIED SCIENCES-BASEL, 2025, 15 (06):
  • [2] A Network Intrusion Detection Model Based on BiLSTM with Multi-Head Attention Mechanism
    Zhang, Jingqi
    Zhang, Xin
    Liu, Zhaojun
    Fu, Fa
    Jiao, Yihan
    Xu, Fei
    ELECTRONICS, 2023, 12 (19)
  • [3] A facial depression recognition method based on hybrid multi-head cross attention network
    Li, Yutong
    Liu, Zhenyu
    Zhou, Li
    Yuan, Xiaoyan
    Shangguan, Zixuan
    Hu, Xiping
    Hu, Bin
    FRONTIERS IN NEUROSCIENCE, 2023, 17
  • [4] Text summarization based on multi-head self-attention mechanism and pointer network
    Dong Qiu
    Bing Yang
    Complex & Intelligent Systems, 2022, 8 : 555 - 567
  • [5] Text summarization based on multi-head self-attention mechanism and pointer network
    Qiu, Dong
    Yang, Bing
    COMPLEX & INTELLIGENT SYSTEMS, 2022, 8 (01) : 555 - 567
  • [6] Network Configuration Entity Extraction Method Based on Transformer with Multi-Head Attention Mechanism
    Yang, Yang
    Qu, Zhenying
    Yan, Zefan
    Gao, Zhipeng
    Wang, Ti
    CMC-COMPUTERS MATERIALS & CONTINUA, 2024, 78 (01): : 735 - 757
  • [7] A Graph Neural Network Social Recommendation Algorithm Integrating the Multi-Head Attention Mechanism
    Yi, Huawei
    Liu, Jingtong
    Xu, Wenqian
    Li, Xiaohui
    Qian, Huihui
    ELECTRONICS, 2023, 12 (06)
  • [8] Structural acceleration response reconstruction based on BiLSTM network and multi-head attention mechanism
    Wang, Zifeng
    Peng, Zhenrui
    STRUCTURES, 2024, 64
  • [9] On the diversity of multi-head attention
    Li, Jian
    Wang, Xing
    Tu, Zhaopeng
    Lyu, Michael R.
    NEUROCOMPUTING, 2021, 454 : 14 - 24
  • [10] A fiber recognition framework based on multi-head attention mechanism
    Xu, Luoli
    Li, Fenying
    Chang, Shan
    TEXTILE RESEARCH JOURNAL, 2024, 94 (23-24) : 2629 - 2640