Federated transfer learning with consensus knowledge distillation for intelligent fault diagnosis under data privacy preserving

Cited by: 1
Authors
Xue, Xingan [1 ]
Zhao, Xiaoping [2 ]
Zhang, Yonghong [1 ]
Ma, Mengyao [2 ]
Bu, Can [3 ]
Peng, Peng [2 ]
Affiliations
[1] Nanjing Univ Informat Sci & Technol, Sch Automat, Nanjing 210044, Peoples R China
[2] Nanjing Univ Informat Sci & Technol, Sch Comp Sci, Nanjing 210044, Peoples R China
[3] Nanjing Normal Univ, Sch Elect & Automat Engn, Nanjing 210023, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
fault diagnosis; federated learning; transfer learning; consensus knowledge distillation; mutual information regularization; ROTATING MACHINERY;
DOI
10.1088/1361-6501/acf77d
Chinese Library Classification
T [Industrial Technology];
Discipline Code
08;
Abstract
Fault diagnosis with deep learning has attracted substantial research attention. However, building a reliable model depends on a large volume of data, and centralizing the data from every device risks privacy leakage. Federated learning enables devices to cooperate in training a global model without violating data privacy. Nevertheless, because of the data distribution discrepancy across devices, a global model trained only on the labeled data of the source client fails to match the target client, which has no labeled data. To overcome this issue, this research proposes a federated transfer learning method. Consensus knowledge distillation is adopted to train an extended target domain model, and a mutual information regularization is introduced to further learn the structural information of the target client data. The source client model and the extended target model are then aggregated to improve model performance. The experimental results demonstrate that our method has broad application prospects.
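The abstract names three ingredients: distilling consensus (teacher) knowledge into an extended target model, a mutual information regularizer on unlabeled target data, and aggregation of client models on the server. The following is a minimal NumPy sketch of how these three pieces are commonly formulated, not the authors' implementation; the function names, the temperature-softened KL distillation loss, and the FedAvg-style weighted averaging are illustrative assumptions:

```python
import numpy as np

def softmax(z, T=1.0):
    """Row-wise softmax with temperature T (numerically stabilized)."""
    z = z / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence from softened teacher (consensus) predictions to
    softened student predictions, scaled by T^2 as in standard KD."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=1)
    return float(np.mean(kl) * T * T)

def mutual_info_regularizer(logits):
    """I(x; y) approximated as H(marginal prediction) minus the mean
    per-sample entropy; maximizing it pushes predictions to be confident
    per sample yet balanced across classes."""
    q = softmax(logits)
    marginal = q.mean(axis=0)
    h_marginal = -np.sum(marginal * np.log(marginal + 1e-12))
    h_conditional = -np.mean(np.sum(q * np.log(q + 1e-12), axis=1))
    return float(h_marginal - h_conditional)

def fedavg(client_weights, client_sizes):
    """Server-side aggregation: average client parameter vectors,
    weighted by local dataset size."""
    sizes = np.asarray(client_sizes, dtype=float)
    coeffs = sizes / sizes.sum()
    return sum(c * w for c, w in zip(coeffs, client_weights))
```

In a full training loop, the distillation and mutual information terms would be combined into the objective of the extended target model, and the server would call an aggregation step such as `fedavg` once per communication round.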
Pages: 15
Related Papers
(50 records in total)
  • [21] FedGKD: Federated Graph Knowledge Distillation for privacy-preserving rumor detection
    Zheng, Peng
    Dou, Yong
    Yan, Yeqing
    KNOWLEDGE-BASED SYSTEMS, 2024, 304
  • [22] Federated learning for intelligent fault diagnosis based on similarity collaboration
    Zhang, Yonghong
    Xue, Xingan
    Zhao, Xiaoping
    Wang, Lihua
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2023, 34 (04)
  • [23] Privacy-Preserving Heterogeneous Personalized Federated Learning With Knowledge
    Pan, Yanghe
    Su, Zhou
    Ni, Jianbing
    Wang, Yuntao
    Zhou, Jinhao
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11 (06): : 5969 - 5982
  • [24] A federated learning framework based on transfer learning and knowledge distillation for targeted advertising
    Su, Caiyu
    Wei, Jinri
    Lei, Yuan
    Li, Jiahui
    PEERJ COMPUTER SCIENCE, 2023, 9
  • [25] Privacy-preserving federated learning with non-transfer learning
    Xu, M.
    Li, X.
    Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2023, 50 (04): : 89 - 99
  • [26] Privacy-Preserving Breast Cancer Classification: A Federated Transfer Learning Approach
    Selvakanmani, S.
    Devi, G. Dharani
    Rekha, V.
    Jeyalakshmi, J.
    JOURNAL OF IMAGING INFORMATICS IN MEDICINE, 2024, 37 (04): : 1488 - 1504
  • [27] Class-Imbalance Privacy-Preserving Federated Learning for Decentralized Fault Diagnosis With Biometric Authentication
    Lu, Shixiang
    Gao, Zhiwei
    Xu, Qifa
    Jiang, Cuixia
    Zhang, Aihua
    Wang, Xiangxiang
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2022, 18 (12) : 9101 - 9111
  • [28] A Personalized Federated Learning Method Based on Knowledge Distillation and Differential Privacy
    Jiang, Yingrui
    Zhao, Xuejian
    Li, Hao
    Xue, Yu
    ELECTRONICS, 2024, 13 (17)
  • [29] Federated learning for preserving data privacy in collaborative healthcare research
    Loftus, Tyler J.
    Ruppert, Matthew M.
    Shickel, Benjamin
    Ozrazgat-Baslanti, Tezcan
    Balch, Jeremy A.
    Efron, Philip A.
    Upchurch, Gilbert R.
    Rashidi, Parisa
    Tignanelli, Christopher
    Bian, Jiang
    Bihorac, Azra
    DIGITAL HEALTH, 2022, 8
  • [30] Privacy-Preserving Federated Learning Model for Healthcare Data
    Ul Islam, Tanzir
    Ghasemi, Reza
    Mohammed, Noman
    2022 IEEE 12TH ANNUAL COMPUTING AND COMMUNICATION WORKSHOP AND CONFERENCE (CCWC), 2022, : 281 - 287