DECENTRALIZED FEDERATED LEARNING VIA MUTUAL KNOWLEDGE DISTILLATION

Cited by: 3
Authors
Huang, Yue [1 ]
Kong, Lanju [1 ,2 ]
Li, Qingzhong [1 ,2 ]
Zhang, Baochen [1 ]
Affiliations
[1] Shandong Univ, Sch Software, Jinan, Shandong, Peoples R China
[2] Dareway Software Co, Jinan, Peoples R China
Source
2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME | 2023
Keywords
Federated learning; mutual knowledge distillation; decentralized
DOI
10.1109/ICME55011.2023.00066
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Federated learning (FL), an emerging decentralized machine learning paradigm, supports collaborative modeling without compromising data privacy. In practical applications, the heterogeneity of FL participants poses a significant challenge. First, clients sometimes need to design custom models for different scenarios and tasks. Second, client drift slows the convergence of the global model. Recently, knowledge distillation has emerged to address this problem by using knowledge from heterogeneous clients to improve model performance. However, this approach requires the construction of a proxy dataset. Moreover, FL is usually performed with the assistance of a central server, which can easily lead to trust issues and communication bottlenecks. To address these issues, this paper proposes a knowledge distillation-based FL scheme called FedDCM. Specifically, each participant maintains two models, a private model and a public model. The two models distill knowledge from each other, so there is no need to build a proxy dataset to train a teacher model. The approach allows for model heterogeneity: each participant's private model can have any architecture. The direct and efficient exchange of information between participants through the public model is more conducive to improving the participants' private models than a centralized server. Experimental results demonstrate the effectiveness of FedDCM, which offers better performance than state-of-the-art methods.
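The mutual-distillation idea in the abstract — each participant's private and public models teaching each other via softened predictions, rather than learning from a teacher trained on a proxy dataset — can be sketched as a pair of blended losses. This is an illustrative NumPy sketch under common deep-mutual-learning conventions, not the paper's actual implementation; the function names, the temperature `T`, and the blending weight `alpha` are assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl(p, q, eps=1e-12):
    """KL divergence KL(p || q) between two probability vectors."""
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def mutual_distillation_losses(logits_private, logits_public, label,
                               T=2.0, alpha=0.5):
    """Each model's loss = (1 - alpha) * cross-entropy on the hard label
    + alpha * T^2 * KL toward the OTHER model's softened prediction,
    so the private and public models distill from each other."""
    p_priv = softmax(logits_private, T)
    p_pub = softmax(logits_public, T)
    ce = lambda logits: -float(np.log(softmax(logits)[label] + 1e-12))
    loss_priv = (1 - alpha) * ce(logits_private) + alpha * (T ** 2) * kl(p_pub, p_priv)
    loss_pub = (1 - alpha) * ce(logits_public) + alpha * (T ** 2) * kl(p_priv, p_pub)
    return loss_priv, loss_pub
```

When both models agree exactly, the KL terms vanish and each loss reduces to its cross-entropy share; as their predictions diverge, each model is pulled toward the other, which is what removes the need for a proxy-trained teacher.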
Pages: 342-347 (6 pages)
Related Papers (50 in total)
[21] A Semi-Supervised Federated Learning Scheme via Knowledge Distillation for Intrusion Detection [J]. Zhao, Ruijie; Yang, Linbo; Wang, Yijun; Xue, Zhi; Gui, Guan; Ohtsuki, Tomoaki. IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022: 2688-2693.
[22] FedDT: A Communication-Efficient Federated Learning via Knowledge Distillation and Ternary Compression [J]. He, Zixiao; Zhu, Gengming; Zhang, Shaobo; Luo, Entao; Zhao, Yijiang. ELECTRONICS, 2025, 14 (11).
[23] Poster: AsyncFedKD: Asynchronous Federated Learning with Knowledge Distillation [J]. Mohammed, Malik Naik; Zhang, Xinyue; Valero, Maria; Xie, Ying. 2023 IEEE/ACM CONFERENCE ON CONNECTED HEALTH: APPLICATIONS, SYSTEMS AND ENGINEERING TECHNOLOGIES, CHASE, 2023: 207-208.
[24] Resource-Aware Knowledge Distillation for Federated Learning [J]. Chen, Zheyi; Tian, Pu; Liao, Weixian; Chen, Xuhui; Xu, Guobin; Yu, Wei. IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTING, 2023, 11 (03): 706-719.
[25] Hierarchical Federated Learning in MEC Networks with Knowledge Distillation [J]. Tuan Dung Nguyen; Ngoc Anh Tong; Nguyen, Binh P.; Quoc Viet Hung Nguyen; Phi Le Nguyen; Thanh Trung Huynh. 2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024, 2024.
[26] FedX: Unsupervised Federated Learning with Cross Knowledge Distillation [J]. Han, Sungwon; Park, Sungwon; Wu, Fangzhao; Kim, Sundong; Wu, Chuhan; Xie, Xing; Cha, Meeyoung. COMPUTER VISION - ECCV 2022, PT XXX, 2022, 13690: 691-707.
[27] IoT Urban River Water Quality System using Federated Learning via Knowledge Distillation [J]. Dahane, Amine; Benameur, Rabaie; Naloufi, Manel; Souihi, Sami; Abreu, Thiago; Lucas, Francoise S.; Mellouk, Abdelhamid. ICC 2024 - IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2024: 1515-1520.
[28] Federated Learning via Augmented Knowledge Distillation for Heterogenous Deep Human Activity Recognition Systems [J]. Gad, Gad; Fadlullah, Zubair. SENSORS, 2023, 23 (01).
[29] FedCD: Personalized Federated Learning via Collaborative Distillation [J]. Ahmad, Sabtain; Aral, Atakan. 2022 IEEE/ACM 15TH INTERNATIONAL CONFERENCE ON UTILITY AND CLOUD COMPUTING, UCC, 2022: 189-194.
[30] An Optimized Unsupervised Defect Detection Approach via Federated Learning and Adaptive Embeddings Knowledge Distillation [J]. Wang, Jinhai; Xue, Junwei; Zhang, Hongyan; Xiao, Hui; Wei, Huiling; Chen, Mingyou; Liao, Jiang; Luo, Lufeng. CMC-COMPUTERS MATERIALS & CONTINUA, 2025, 84 (01): 1839-1861.