Personalized Decentralized Federated Learning with Knowledge Distillation

Cited by: 2
Authors
Jeong, Eunjeong [1 ]
Kountouris, Marios [1 ]
Affiliations
[1] EURECOM, Commun Syst Dept, F-06410 Sophia Antipolis, France
Source
ICC 2023 - IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS | 2023
Keywords
decentralized federated learning; personalization; knowledge distillation
DOI
10.1109/ICC45041.2023.10279714
CLC (Chinese Library Classification) number
TN [Electronic Technology, Communication Technology]
Discipline classification code
0809
Abstract
Personalization in federated learning (FL) acts as a coordinator for clients with high variance in data or behavior. Whether these clients' models converge depends on how closely each user collaborates with users exhibiting similar patterns or preferences. In a decentralized network, however, quantifying such similarity is generally difficult because each user has only limited knowledge of other users' models. To address this issue, we propose a personalized, fully decentralized FL algorithm that leverages knowledge distillation to let each device discern statistical distances between local models. Each client can improve its performance without sharing local data by estimating, as in knowledge distillation, the similarity between the intermediate outputs that two models produce on its local samples. Our empirical studies demonstrate that the proposed algorithm improves clients' test accuracy in fewer iterations under highly non-independent and identically distributed (non-i.i.d.) data and benefits agents with small datasets, all without a central server.
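As a rough illustration of the idea the abstract describes, the sketch below shows how a client could score each neighbor by comparing softened (distillation-style) predictions on its own local samples, then mix neighbor parameters with similarity weights. This is only a minimal PyTorch sketch, not the authors' algorithm: the function names (estimate_similarity, aggregate_neighbors), the exp(-KL) similarity score, the temperature value, and the assumption that all models share one architecture without integer buffers are all illustrative choices.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

@torch.no_grad()
def estimate_similarity(local_model, neighbor_model, local_batch, temperature=2.0):
    """Score a neighbor by how closely its softened predictions on the
    client's own samples match the local model's; exp(-KL) lies in (0, 1]."""
    local_model.eval()
    neighbor_model.eval()
    p_local = F.softmax(local_model(local_batch) / temperature, dim=1)
    log_p_nb = F.log_softmax(neighbor_model(local_batch) / temperature, dim=1)
    # KL(local || neighbor): small when the two models behave alike on this data
    kl = F.kl_div(log_p_nb, p_local, reduction="batchmean")
    return torch.exp(-kl).item()

@torch.no_grad()
def aggregate_neighbors(local_model, neighbor_models, local_batch):
    """Replace local weights with a similarity-weighted average over the
    local model (weight 1 before normalization) and its neighbors."""
    models = [local_model] + list(neighbor_models)
    sims = torch.tensor(
        [1.0] + [estimate_similarity(local_model, m, local_batch)
                 for m in neighbor_models]
    )
    weights = sims / sims.sum()
    mixed = {k: torch.zeros_like(v, dtype=torch.float32)
             for k, v in local_model.state_dict().items()}
    for w, m in zip(weights, models):
        for k, v in m.state_dict().items():
            mixed[k] += w * v.float()
    local_model.load_state_dict(mixed)

# Hypothetical usage: three clients sharing one simple architecture.
net = lambda: nn.Sequential(nn.Flatten(), nn.Linear(784, 10))
local, n1, n2 = net(), net(), net()
x = torch.randn(32, 1, 28, 28)  # a batch of the client's local samples
aggregate_neighbors(local, [n1, n2], x)
```

Note that only model outputs on local samples are compared, so no raw data leaves the device, which is consistent with the privacy motivation stated in the abstract.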
Pages: 1982 - 1987
Page count: 6
Related papers
50 records in total
  • [41] Incentive and Knowledge Distillation Based Federated Learning for Cross-Silo Applications
    Li, Beibei
    Shi, Yaxin
    Guo, Yuqing
    Kong, Qinglei
    Jiang, Yukun
    IEEE INFOCOM 2022 - IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS (INFOCOM WKSHPS), 2022,
  • [42] FedAL: Black-Box Federated Knowledge Distillation Enabled by Adversarial Learning
    Han, Pengchao
    Shi, Xingyan
    Huang, Jianwei
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2024, 42 (11) : 3064 - 3077
  • [43] One Teacher is Enough: A Server-Clueless Federated Learning With Knowledge Distillation
    Ning, Wanyi
    Qi, Qi
    Wang, Jingyu
    Zhu, Mengde
    Li, Shaolong
    Yang, Guang
    Liao, Jianxin
    IEEE TRANSACTIONS ON SERVICES COMPUTING, 2024, 17 (05) : 2704 - 2718
  • [44] Layer-wise Knowledge Distillation for Cross-Device Federated Learning
    Le, Huy Q.
    Nguyen, Loc X.
    Park, Seong-Bae
    Hong, Choong Seon
    2023 INTERNATIONAL CONFERENCE ON INFORMATION NETWORKING, ICOIN, 2023, : 526 - 529
  • [45] Training Heterogeneous Client Models using Knowledge Distillation in Serverless Federated Learning
    Chadha, Mohak
    Khera, Pulkit
    Gu, Jianfeng
    Abboud, Osama
    Gerndt, Michael
    39TH ANNUAL ACM SYMPOSIUM ON APPLIED COMPUTING, SAC 2024, 2024, : 997 - 1006
  • [46] Bearing Faulty Prediction Method Based on Federated Transfer Learning and Knowledge Distillation
    Zhou, Yiqing
    Wang, Jian
    Wang, Zeru
    MACHINES, 2022, 10 (05)
  • [47] Digital Twin-Assisted Knowledge Distillation Framework for Heterogeneous Federated Learning
    Wang, Xiucheng
    Cheng, Nan
    Ma, Longfei
    Sun, Ruijin
    Chai, Rong
    Lu, Ning
    CHINA COMMUNICATIONS, 2023, 20 (02) : 61 - 78
  • [48] Federated learning by employing knowledge distillation on edge devices with limited hardware resources
    Tanghatari, Ehsan
    Kamal, Mehdi
    Afzali-Kusha, Ali
    Pedram, Massoud
    NEUROCOMPUTING, 2023, 531 : 87 - 99
  • [49] Brain Tumors Classification in MRIs Based on Personalized Federated Distillation Learning With Similarity-Preserving
    Wu, Bo
    Shi, Donghui
    Aguilar, Jose
    INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, 2025, 35 (02)
  • [50] A personalized federated cloud-edge collaboration framework via cross-client knowledge distillation
    Zhang, Shining
    Wang, Xingwei
    Zeng, Rongfei
    Zeng, Chao
    Li, Ying
    Huang, Min
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2025, 165