Poster: AsyncFedKD: Asynchronous Federated Learning with Knowledge Distillation

Cited by: 2
Authors
Mohammed, Malik Naik [1 ]
Zhang, Xinyue [1 ]
Valero, Maria [1 ]
Xie, Ying [1 ]
Affiliation
[1] Kennesaw State Univ, Marietta, GA 30060 USA
Source
2023 IEEE/ACM CONFERENCE ON CONNECTED HEALTH: APPLICATIONS, SYSTEMS AND ENGINEERING TECHNOLOGIES, CHASE | 2023
Keywords
Federated Learning; Knowledge Distillation; Focal Loss
DOI
10.1145/3580252.3589436
CLC (Chinese Library Classification)
TP3 [Computing technology, computer technology]
Discipline code
0812
Abstract
Federated learning (FL) allows the decentralized training of a global model on edge devices without transferring data samples, thus preserving privacy. Owing to the ubiquity of wearable and mobile devices running health applications, FL has shown promise in the medical field for applications such as medical imaging, disease diagnosis, and electronic health record (EHR) analysis. However, resource-constrained edge devices can straggle and slow down the training process. To address this issue and increase efficiency, we propose Asynchronous Federated Learning with Knowledge Distillation (AsyncFedKD). AsyncFedKD asynchronously trains a lightweight global student model using a pre-trained teacher model, so that slow edge devices do not stall training. The knowledge distillation component also compresses the model parameters, enabling efficient communication during training. AsyncFedKD was evaluated on a sensitive mammography cancer dataset, and the global model achieved an accuracy of 88%.
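The poster itself does not include code, but the mechanism the abstract describes, soft-target distillation from a fixed pre-trained teacher combined with the focal loss named in the keywords, plus staleness-aware asynchronous aggregation, can be sketched in PyTorch. This is a minimal illustrative sketch under stated assumptions, not the authors' implementation: the temperature T, mixing weight alpha, focal gamma, and the staleness discount in async_aggregate are all assumed values.

import torch
import torch.nn.functional as F

def focal_loss(logits, targets, gamma=2.0):
    # Focal loss (Lin et al., 2017): down-weights easy, well-classified
    # examples so training focuses on hard ones. gamma=2.0 is the common
    # default, not a value taken from the poster.
    log_probs = F.log_softmax(logits, dim=1)
    ce = F.nll_loss(log_probs, targets, reduction="none")            # per-sample cross-entropy
    pt = log_probs.gather(1, targets.unsqueeze(1)).squeeze(1).exp()  # probability of true class
    return ((1.0 - pt) ** gamma * ce).mean()

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.7):
    # Standard soft-target distillation: KL divergence between the
    # temperature-softened teacher and student outputs, blended with
    # focal loss on the ground-truth labels. T and alpha are hypothetical.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)                      # T^2 restores the gradient scale of the soft term
    hard = focal_loss(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard

def async_aggregate(global_w, client_w, staleness, alpha0=0.6):
    # Staleness-aware asynchronous mixing in the spirit of FedAsync
    # (Xie et al., 2019): each client update is folded into the global
    # student model as it arrives, without waiting for stragglers,
    # discounted by how stale it is. The polynomial discount is an
    # assumed choice for illustration.
    a = alpha0 / (1.0 + staleness)
    return {k: (1.0 - a) * global_w[k] + a * client_w[k] for k in global_w}

Under this reading, each edge device would minimize distillation_loss locally against the fixed teacher and push its lightweight student weights to the server, which applies async_aggregate on arrival so that no round blocks on the slowest device.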
Pages: 207-208
Page count: 2