Multiassistant Knowledge Distillation for Lightweight Bearing Fault Diagnosis Based on Decreasing Threshold Channel Pruning

Cited by: 2
Authors
Zhong, Hongyu [1 ,2 ,3 ]
Yu, Samson [3 ]
Trinh, Hieu [3 ]
Lv, Yong [1 ,2 ]
Yuan, Rui [1 ,2 ]
Wang, Yanan [3 ]
Affiliations
[1] Wuhan Univ Sci & Technol, Key Lab Met Equipment & Control Technol, Minist Educ, Wuhan 430081, Peoples R China
[2] Wuhan Univ Sci & Technol, Hubei Key Lab Mech Transmiss & Mfg Engn, Wuhan 430081, Peoples R China
[3] Deakin Univ, Sch Engn, Geelong, Vic 3216, Australia
Funding
National Natural Science Foundation of China
Keywords
Channel pruning; intelligent fault diagnosis; knowledge distillation (KD); wavelet transform; neural networks
DOI
10.1109/JSEN.2023.3332653
CLC Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline Code
0808; 0809
Abstract
Bearing fault detection and classification with a diagnosis model that uses few parameters remains a challenging problem. A common solution is knowledge distillation (KD) with teacher-student models: through the distillation process, the student model acquires knowledge from the teacher model and improves its performance without introducing extra parameters. However, distillation from a powerful teacher is not always effective, because a stronger teacher can form classification strategies too specific for a small student to imitate, degrading distillation performance. To this end, the multiassistant KD (MAKD) method is proposed, which bridges the capacity gap between the teacher and student models by inserting several intermediate-sized assistant models (AMs). These AMs share the same architecture, which creates more favorable conditions for knowledge transfer at the logit layer. To further optimize the network structure and improve distillation performance, decreasing threshold channel pruning (DTCP) is proposed to generate better AMs. DTCP uses the scatter value of a decreasing function to prune the channels of the teacher model, retaining more of the channels that benefit distillation. Finally, four-class and ten-class classification experiments are conducted on two bearing datasets. The results demonstrate that the proposed DTCP-MAKD method improves distillation performance and outperforms other state-of-the-art KD methods.
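To make the method concrete, a minimal PyTorch-style sketch of the two components described in the abstract is given below. This is not the authors' implementation: the stand-in MLP models, the linear decreasing-threshold schedule, the temperature T, and the weighting alpha are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Standard logit-layer KD loss: temperature-scaled KL term on soft
    # targets plus a hard-label cross-entropy term.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def decreasing_threshold_mask(channel_scores, step, n_steps, base=0.5):
    # Hypothetical DTCP-style criterion: keep channels whose importance
    # score exceeds a threshold that decreases at each pruning step, so
    # later steps retain more channels that are useful for distillation.
    thresh = base * (1.0 - step / n_steps)  # assumed decreasing schedule
    return channel_scores.abs() > thresh

# Teacher -> assistants -> student chain of shrinking width; all members
# share the same two-layer architecture, as MAKD requires.
widths = [256, 128, 64, 32]
chain = [nn.Sequential(nn.Linear(1024, w), nn.ReLU(), nn.Linear(w, 4))
         for w in widths]

x = torch.randn(8, 1024)       # stand-in for wavelet features of vibration signals
y = torch.randint(0, 4, (8,))  # four-class bearing-fault labels

# MAKD: distill pairwise down the chain, each model learning from its
# immediate predecessor rather than directly from the large teacher.
for teacher, student in zip(chain[:-1], chain[1:]):
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    with torch.no_grad():
        t_logits = teacher(x)
    loss = kd_loss(student(x), t_logits, y)
    opt.zero_grad()
    loss.backward()
    opt.step()

# In the paper, DTCP is applied to the teacher's channels to derive the
# AMs; here the mask is only evaluated, since actual channel removal
# depends on the surrounding architecture.
scores = torch.rand(256)  # e.g., BN scaling factors of one teacher layer
keep = decreasing_threshold_mask(scores, step=1, n_steps=3)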
Pages: 486-494
Page count: 9
Related Papers
50 records in total
  • [1] Lightweight Knowledge Distillation-Based Transfer Learning Framework for Rolling Bearing Fault Diagnosis
    Lu, Ruijia
    Liu, Shuzhi
    Gong, Zisu
    Xu, Chengcheng
    Ma, Zonghe
    Zhong, Yiqi
    Li, Baojian
    SENSORS, 2024, 24 (06)
  • [2] A Lightweight Network With Adaptive Input and Adaptive Channel Pruning Strategy for Bearing Fault Diagnosis
    Liu, Lei
    Cheng, Yao
    Song, Dongli
    Zhang, Weihua
    Tang, Guiting
    Luo, Yaping
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2024, 73 : 1 - 11
  • [3] Lightweight Edge-side Fault Diagnosis Based on Knowledge Distillation
    Shang, Yingjun
    Feng, Tao
    Huo, Yonghua
    Duan, Yongcun
    Long, Yuhan
    2022 IEEE 14TH INTERNATIONAL CONFERENCE ON ADVANCED INFOCOMM TECHNOLOGY (ICAIT 2022), 2022, : 348 - 353
  • [4] Lightweight fault diagnosis method in embedded system based on knowledge distillation
    Gong, Ran
    Wang, Chenlin
    Li, Jinxiao
    Xu, Yi
    JOURNAL OF MECHANICAL SCIENCE AND TECHNOLOGY, 2023, 37 (11) : 5649 - 5660
  • [5] A Lightweight and Small Sample Bearing Fault Diagnosis Algorithm Based on Probabilistic Decoupling Knowledge Distillation and Meta-Learning
    Luo, Hao
    Ren, Tongli
    Zhang, Ying
    Zhang, Li
    SENSORS, 2024, 24 (24)
  • [6] Lightweight intelligent fault diagnosis method based on a multi-stage pruning distillation interleaving network
    Ren, Linlin
    Li, Xiaoming
    Ma, Hongbo
    Zhang, Guowei
    Huang, Song
    Chen, Ke
    Wang, Xiaoqing
    Yue, Weijie
    ADVANCES IN MECHANICAL ENGINEERING, 2024, 16 (09)
  • [7] Applied Research on Bearing Fault Diagnosis Based on Knowledge Distillation and Transfer Learning
    Wang, Tingxuan
    Liu, Tao
    Wang, Zhenya
    Pu, Huijie
    COMPUTER ENGINEERING AND APPLICATIONS, 2023, 59 (13) : 289 - 297
  • [8] Application of variable temperature gradient TOP-K knowledge distillation with model pruning in lightweight fault diagnosis for bearings
    Cui, Ze
    Yang, Qishuang
    Xiong, Zixiang
    Gu, Rongyang
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2025, 36 (02)
  • [9] Network lightweight method based on knowledge distillation is applied to RV reducer fault diagnosis
    He, Feifei
    Liu, Chang
    Wang, Mengdi
    Yang, Enshan
    Liu, Xiaoqin
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2023, 34 (09)