Heterogeneous Federated Learning Framework for IIoT Based on Selective Knowledge Distillation

Cited by: 0
Authors
Guo, Sheng [1 ]
Chen, Hui [2 ]
Liu, Yang [3 ]
Yang, Chengyi [2 ]
Li, Zengxiang [2 ]
Jin, Cheng Hao [4 ]
Affiliations
[1] Luculent Smart Technol Co Ltd, Beijing 100000, Peoples R China
[2] Enn Grp, Digital Res Inst, Langfang 065000, Peoples R China
[3] Tsinghua Univ, Inst AI Ind Res, Beijing 100084, Peoples R China
[4] Enn Grp, Energy Res Inst, Langfang 065000, Peoples R China
Keywords
Data models; Production facilities; Training; Servers; Computational modeling; Industrial Internet of Things; Fault diagnosis; Federated learning; Cloud computing; Machinery; Data heterogeneity; fault diagnosis; federated learning; knowledge distillation;
DOI
10.1109/TII.2024.3452229
Chinese Library Classification (CLC) Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
The lack of complete labels and data heterogeneity are obstacles to the application of artificial intelligence-based methods in industrial scenarios, such as machinery fault diagnosis. To address these challenges, this article proposes a federated learning (FL) framework for the industrial Internet of Things based on bidirectional knowledge distillation (KD) and hard sample selection. In the framework, the cloud server provides a deep learning (DL) model pretrained on a cross-domain public dataset to facilitate cold start in real-world applications. During training, each participating factory trains a heterogeneous local DL model sized to its local data volume and computing resources. Bidirectional KD with feature maps and hard sample selection is then carried out on a shared dataset between the server and factories to exchange knowledge efficiently. Moreover, all the DL models used in the proposed framework are designed based on domain expertise and attention mechanisms to diagnose multiple types of machinery and faults. Case studies on vibration data collected from multiple factories show that the proposed framework improves fault diagnosis accuracy compared to other FL methods while significantly reducing communication overhead.
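The hard sample selection step described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the function name `selective_kd_loss`, the selection criterion (largest per-sample student-teacher divergence), and the `top_ratio` and `temperature` parameters are all assumptions chosen to make the idea concrete.

```python
import torch
import torch.nn.functional as F

def selective_kd_loss(student_logits, teacher_logits,
                      top_ratio=0.5, temperature=2.0):
    """Distill only on the hardest samples, here taken to be those
    where student and teacher disagree most (largest per-sample KL
    divergence between their softened output distributions)."""
    # Softened distributions, as in standard knowledge distillation.
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # Per-sample KL divergence (sum over classes, one value per sample).
    per_sample_kl = F.kl_div(log_p_student, p_teacher,
                             reduction="none").sum(dim=1)
    # Keep only the top fraction of hardest samples for the loss.
    k = max(1, int(top_ratio * per_sample_kl.numel()))
    hard_kl, _ = torch.topk(per_sample_kl, k)
    # Scale by T^2 so gradients keep comparable magnitude across T.
    return hard_kl.mean() * temperature ** 2

# Usage on a dummy batch of 8 samples with 4 fault classes.
student_logits = torch.randn(8, 4)
teacher_logits = torch.randn(8, 4)
loss = selective_kd_loss(student_logits, teacher_logits)
```

In a federated setting, a loss of this shape would be computed on the shared dataset in both directions (server model distilling to factory models and vice versa); restricting the loss to hard samples reduces how much distillation signal, and hence communication, each round requires.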
Pages: 1078 - 1089
Page count: 12
Related Papers
50 records
  • [1] A Prototype-Based Knowledge Distillation Framework for Heterogeneous Federated Learning
    Lyu, Feng
    Tang, Cheng
    Deng, Yongheng
    Liu, Tong
    Zhang, Yongmin
    Zhang, Yaoxue
    2023 IEEE 43RD INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS, ICDCS, 2023, : 37 - 47
  • [3] Digital Twin-Assisted Knowledge Distillation Framework for Heterogeneous Federated Learning
    Wang, Xiucheng
    Cheng, Nan
    Ma, Longfei
    Sun, Ruijin
    Chai, Rong
    Lu, Ning
    CHINA COMMUNICATIONS, 2023, 20 (02) : 61 - 78
  • [4] FedTKD: A Trustworthy Heterogeneous Federated Learning Based on Adaptive Knowledge Distillation
    Chen, Leiming
    Zhang, Weishan
    Dong, Cihao
    Zhao, Dehai
    Zeng, Xingjie
    Qiao, Sibo
    Zhu, Yichang
    Tan, Chee Wei
    ENTROPY, 2024, 26 (01)
  • [5] A federated learning framework based on transfer learning and knowledge distillation for targeted advertising
    Su, Caiyu
    Wei, Jinri
    Lei, Yuan
    Li, Jiahui
    PEERJ COMPUTER SCIENCE, 2023, 9
  • [6] Heterogeneous Defect Prediction Based on Federated Transfer Learning via Knowledge Distillation
    Wang, Aili
    Zhang, Yutong
    Yan, Yixin
    IEEE ACCESS, 2021, 9 : 29530 - 29540
  • [7] Fedadkd: Heterogeneous Federated Learning via Adaptive Knowledge Distillation
    Song, Yalin
    Liu, Hang
    Zhao, Shuai
    Jin, Haozhe
    Yu, Junyang
    Liu, Yanhong
    Zhai, Rui
    Wang, Longge
    PATTERN ANALYSIS AND APPLICATIONS, 2024, 27 (04)
  • [8] Data-Free Knowledge Distillation for Heterogeneous Federated Learning
    Zhu, Zhuangdi
    Hong, Junyuan
    Zhou, Jiayu
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [9] Federated Learning Algorithm Based on Knowledge Distillation
    Jiang, Donglin
    Shan, Chen
    Zhang, Zhihui
    2020 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND COMPUTER ENGINEERING (ICAICE 2020), 2020, : 163 - 167
  • [10] FEDGKD: Toward Heterogeneous Federated Learning via Global Knowledge Distillation
    Yao, Dezhong
    Pan, Wanning
    Dai, Yutong
    Wan, Yao
    Ding, Xiaofeng
    Yu, Chen
    Jin, Hai
    Xu, Zheng
    Sun, Lichao
    IEEE TRANSACTIONS ON COMPUTERS, 2024, 73 (01) : 3 - 17