Accurate Ferrite Core Loss Model Based on CNN-BiLSTM and Few-shot Transfer Learning Prediction Method

Authors
Liu, Zhanlei [1 ]
Zhu, Lingyu [1 ]
Zhan, Cao [2 ]
Dang, Yongliang [1 ]
Zhang, Yukun [1 ]
Ji, Shengchang [1 ]
Affiliations
[1] State Key Laboratory of Electrical Insulation and Power Equipment, Xi’an Jiaotong University, Xi’an
[2] Center for Power Electronics Systems (CPES), Virginia Tech, Blacksburg, VA 24061
Source
Gaodianya Jishu/High Voltage Engineering | 2024, Vol. 50, No. 10
Keywords
CNN-BiLSTM; core loss; ferrite; few-shot dataset; transfer learning;
DOI
10.13336/j.1003-6520.hve.20240847
Abstract
Traditional empirical formulas and loss-separation formulas cannot accurately calculate ferrite core loss under wide-frequency, wide-flux-density, wide-temperature-range, and complex-waveform excitations. Considering that core loss depends on both the local and the long-term characteristics of the flux-density waveform, and using the MagNet dataset built by researchers at Princeton University, we established a large-sample core-loss pre-training model based on CNN-BiLSTM. Its average prediction errors are all below 3% and its 95th-percentile errors are all below 10%. Taking 3C90 and N87 as examples, few-shot core-loss datasets are established and transfer learning is applied to train the model. The optimal transfer-learning strategy is selected and an optimal source-model selection method is proposed. The training steps required by transfer learning and by direct training are compared, and the impacts of few-shot data size and initial learning rate on the transfer-learning effect are analyzed. For a sample size of 1 000, transfer learning reduces the required training steps from 500 (for direct training) to 50. The average prediction errors of the 3C90 and N87 ferrite core losses are reduced from 4.49% and 6.6% to 2.66% and 2.35%, respectively, and the 95th-percentile errors from 11.97% and 17.12% to 7.22% and 6.21%, respectively. Both the convergence speed and the prediction accuracy of the model are improved. In practical engineering, only a few-shot dataset is required to fine-tune the source-domain model parameters, enabling fast model training and accurate core-loss prediction. © 2024 Science Press. All rights reserved.
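The CNN-BiLSTM pipeline summarized in the abstract (a 1-D CNN extracting local features of the flux-density waveform, a BiLSTM capturing long-term dependencies, and a regression head predicting core loss) can be sketched as below. This is a minimal illustration in PyTorch under assumed layer sizes and sequence length; the actual architecture, hyperparameters, and frozen-layer choice used by the authors are not specified here.

```python
import torch
import torch.nn as nn


class CNNBiLSTM(nn.Module):
    """Sketch of a CNN-BiLSTM core-loss regressor.

    The 1-D CNN extracts local features of the sampled flux-density
    waveform B(t); the BiLSTM models its long-term characteristics;
    a linear head regresses the core loss. All layer sizes are
    illustrative assumptions, not the paper's configuration.
    """

    def __init__(self, conv_ch: int = 16, hidden: int = 32):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(1, conv_ch, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.bilstm = nn.LSTM(conv_ch, hidden, batch_first=True,
                              bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)

    def forward(self, b_wave: torch.Tensor) -> torch.Tensor:
        # b_wave: (batch, seq_len) samples of the flux-density waveform
        x = self.cnn(b_wave.unsqueeze(1))      # (batch, conv_ch, seq_len/2)
        x, _ = self.bilstm(x.transpose(1, 2))  # (batch, seq_len/2, 2*hidden)
        return self.head(x[:, -1]).squeeze(-1)  # (batch,) predicted loss


model = CNNBiLSTM()
pred = model(torch.randn(4, 128))  # 4 waveforms of 128 samples each
print(pred.shape)                  # torch.Size([4])

# Few-shot transfer (one common strategy, assumed here for illustration):
# freeze the convolutional feature extractor pre-trained on the large
# MagNet dataset and fine-tune only the BiLSTM and head on the small
# target-material dataset (e.g. 3C90 or N87).
for p in model.cnn.parameters():
    p.requires_grad = False
```

In this sketch, fine-tuning would then pass only `filter(lambda p: p.requires_grad, model.parameters())` to the optimizer, which matches the abstract's idea of adjusting a pre-trained source model with a few-shot dataset rather than training from scratch.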
Pages: 4487–4498 (11 pages)