FedDKD: Federated learning with decentralized knowledge distillation

Cited by: 0
Authors
Xinjia Li
Boyu Chen
Wenlian Lu
Affiliations
[1] Fudan University, School of Mathematical Sciences
[2] Fudan University, Shanghai Center for Mathematical Sciences
[3] Fudan University, Shanghai Key Laboratory for Contemporary Applied Mathematics
Source
Applied Intelligence | 2023, Vol. 53
Keywords
Federated learning; Knowledge distillation; Heterogeneous data; Data-free algorithm
DOI: Not available
Abstract
Heterogeneity in the data distribution generally degrades the performance of federated learning with neural networks. Taking a weighted average of the local models, as most existing federated learning algorithms do, does not guarantee that the resulting global model is consistent with the local models in the space of neural network maps. In this paper, we highlight the significance of the space of neural network maps for mitigating the performance decay caused by data heterogeneity, and we propose a novel federated learning framework equipped with a decentralized knowledge distillation process (FedDKD). FedDKD introduces a decentralized knowledge distillation (DKD) module that distills the knowledge of the local models to teach the global model to approach the average of the neural network maps, by optimizing a divergence defined in the loss function rather than merely averaging parameters as in the literature. Numerical experiments on various heterogeneous datasets show that FedDKD outperforms state-of-the-art methods, especially on some extremely heterogeneous datasets.
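To make the core idea concrete, the following is a minimal PyTorch sketch of one DKD-style distillation step, written from the abstract alone. The function name dkd_step, the temperature parameter T, and the choice of KL divergence are illustrative assumptions, not the authors' implementation: the global (student) model is pulled toward the average of the local (teacher) models' output distributions, i.e., toward the average in the space of neural network maps rather than in parameter space.

```python
# Hypothetical sketch of a decentralized knowledge distillation (DKD) step.
# Assumptions (not from the paper): one server-side optimizer step per batch,
# KL divergence as the loss, and a softmax temperature T.
import torch
import torch.nn.functional as F

def dkd_step(global_model, local_models, x, optimizer, T=1.0):
    """Pull the global model toward the average of the local models' maps.

    global_model: the student network being distilled on the server.
    local_models: list of client (teacher) networks, frozen here.
    x: a batch of inputs available for distillation.
    """
    with torch.no_grad():
        # Average of the local models *as maps*: mean of their softened
        # output distributions, not a mean of their parameters.
        teacher_probs = torch.stack(
            [F.softmax(m(x) / T, dim=-1) for m in local_models]
        ).mean(dim=0)

    student_log_probs = F.log_softmax(global_model(x) / T, dim=-1)
    # Divergence between the global map and the averaged local map;
    # the T*T factor is the usual distillation gradient rescaling.
    loss = F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * T * T

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In a full round, the server would presumably run several such steps before broadcasting the updated global model; the paper's exact divergence, schedule, and data-access scheme may differ from this sketch.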
Pages: 18547–18563 (16 pages)