DECENTRALIZED FEDERATED LEARNING VIA MUTUAL KNOWLEDGE DISTILLATION

Cited by: 2
Authors
Huang, Yue [1 ]
Kong, Lanju [1 ,2 ]
Li, Qingzhong [1 ,2 ]
Zhang, Baochen [1 ]
Affiliations
[1] Shandong Univ, Sch Software, Jinan, Shandong, Peoples R China
[2] Dareway Software Co, Jinan, Peoples R China
Source
2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME | 2023
Keywords
Federated learning; mutual knowledge distillation; decentralized
DOI
10.1109/ICME55011.2023.00066
CLC Number
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Federated learning (FL), an emerging decentralized machine learning paradigm, enables collaborative modeling without compromising data privacy. In practical applications, participant heterogeneity poses a significant challenge for FL. First, clients sometimes need to design custom models for different scenarios and tasks. Second, client drift leads to slow convergence of the global model. Recently, knowledge distillation has been applied to this problem, using knowledge from heterogeneous clients to improve model performance. However, this approach requires the construction of a proxy dataset, and FL is usually performed with the assistance of a central server, which can easily lead to trust issues and communication bottlenecks. To address these issues, this paper proposes a knowledge distillation-based FL scheme called FedDCM. Specifically, each participant maintains two models, a private model and a public model, which distill knowledge from each other, so there is no need to build a proxy dataset to train a teacher model. The approach allows for model heterogeneity: each participant can have a private model of any architecture. Exchanging information directly and efficiently between participants through the public model is more conducive to improving the participants' private models than relying on a centralized server. Experimental results demonstrate the effectiveness of FedDCM, which offers better performance than state-of-the-art methods.
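As a rough illustration of the mutual-distillation step the abstract describes, below is a minimal PyTorch sketch of one local update, assuming a classification task. The function name mutual_distillation_step, the temperature T, and the mixing weight alpha are illustrative assumptions, not details taken from the paper.

    import torch
    import torch.nn.functional as F

    def mutual_distillation_step(private_model, public_model, x, y,
                                 opt_priv, opt_pub, T=2.0, alpha=0.5):
        """One hypothetical local step: each model fits the labels and
        distills from the other model's softened predictions."""
        logits_priv = private_model(x)  # private model, any architecture
        logits_pub = public_model(x)    # public model, exchanged with peers

        # Softened targets; detached so each model treats the other as a
        # fixed teacher within this step.
        soft_priv = F.softmax(logits_priv / T, dim=1).detach()
        soft_pub = F.softmax(logits_pub / T, dim=1).detach()

        # Private model: supervised loss + KL toward the public model.
        loss_priv = (1 - alpha) * F.cross_entropy(logits_priv, y) \
            + alpha * (T ** 2) * F.kl_div(
                F.log_softmax(logits_priv / T, dim=1), soft_pub,
                reduction="batchmean")
        # Public model: supervised loss + KL toward the private model.
        loss_pub = (1 - alpha) * F.cross_entropy(logits_pub, y) \
            + alpha * (T ** 2) * F.kl_div(
                F.log_softmax(logits_pub / T, dim=1), soft_priv,
                reduction="batchmean")

        opt_priv.zero_grad(); loss_priv.backward(); opt_priv.step()
        opt_pub.zero_grad(); loss_pub.backward(); opt_pub.step()
        return loss_priv.item(), loss_pub.item()

After local training, only the public model's parameters would be exchanged among participants, while each private model, of arbitrary architecture, stays with its owner; this is consistent with the abstract's claim that neither a proxy dataset nor a central server is needed.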
Pages: 342-347
Number of pages: 6
Related Papers
50 records in total
• [31] FedRAD: Heterogeneous Federated Learning via Relational Adaptive Distillation. Tang, Jianwu; Ding, Xuefeng; Hu, Dasha; Guo, Bing; Shen, Yuncheng; Ma, Pan; Jiang, Yuming. [J]. SENSORS, 2023, 23 (14)
• [32] A Network Resource Aware Federated Learning Approach using Knowledge Distillation. Mishra, Rahul; Gupta, Hari Prabhat; Dutta, Tanima. [J]. IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS (IEEE INFOCOM WKSHPS 2021), 2021
• [33] Heterogeneous Federated Learning Framework for IIoT Based on Selective Knowledge Distillation. Guo, Sheng; Chen, Hui; Liu, Yang; Yang, Chengyi; Li, Zengxiang; Jin, Cheng Hao. [J]. IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2025, 21 (02): 1078-1089
• [34] Parameterized data-free knowledge distillation for heterogeneous federated learning. Guo, Cheng; He, Qianqian; Tang, Xinyu; Liu, Yining; Jie, Yingmo. [J]. KNOWLEDGE-BASED SYSTEMS, 2025, 317
• [35] FedDK: Improving Cyclic Knowledge Distillation for Personalized Healthcare Federated Learning. Xu, Yikai; Fan, Hongbo. [J]. IEEE ACCESS, 2023, 11: 72409-72417
• [36] Energy-Efficient Federated Knowledge Distillation Learning in Internet of Drones. Cal, Semih; Sun, Xiang; Yao, Jingjing. [J]. 2024 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS WORKSHOPS, ICC WORKSHOPS 2024, 2024: 1256-1261
• [37] Knowledge Distillation Assisted Robust Federated Learning: Towards Edge Intelligence. Qiao, Yu; Adhikary, Apurba; Kim, Ki Tae; Zhang, Chaoning; Hong, Choong Seon. [J]. ICC 2024 - IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2024: 843-848
• [38] A Personalized Federated Learning Method Based on Knowledge Distillation and Differential Privacy. Jiang, Yingrui; Zhao, Xuejian; Li, Hao; Xue, Yu. [J]. ELECTRONICS, 2024, 13 (17)
• [39] A Prototype-Based Knowledge Distillation Framework for Heterogeneous Federated Learning. Lyu, Feng; Tang, Cheng; Deng, Yongheng; Liu, Tong; Zhang, Yongmin; Zhang, Yaoxue. [J]. 2023 IEEE 43RD INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS, ICDCS, 2023: 37-47
• [40] Prototype-Decomposed Knowledge Distillation for Learning Generalized Federated Representation. Wu, Aming; Yu, Jiaping; Wang, Yuxuan; Deng, Cheng. [J]. IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26: 10991-11002