Two-stage model fusion scheme based on knowledge distillation for stragglers in federated learning

Cited by: 0
Authors
Xu, Jiuyun [1 ]
Li, Xiaowen [1 ]
Zhu, Kongshang [1 ]
Zhou, Liang [1 ]
Zhao, Yingzhi [1 ]
Affiliations
[1] China Univ Petr East China, Qingdao Inst Software, Coll Comp Sci & Technol, 66 Changjiang West Rd, Qingdao 266580, Peoples R China
Keywords
Federated learning; Straggler problem; Knowledge distillation; Heterogeneity; Training efficiency;
DOI
10.1007/s13042-024-02436-5
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Federated learning (FL), an emerging distributed learning paradigm, enables devices (also called clients) that store local data to participate collaboratively in a training task without the data ever leaving the devices, thereby integrating multiparty data while meeting privacy-protection requirements. In real-world environments, however, the participating clients are autonomous entities with heterogeneous capabilities and unstable networks, so FL is plagued by stragglers whenever intermediate training results must be exchanged synchronously. To address this, this paper proposes FedTd, a new FL scheme with a two-stage fusion process based on knowledge distillation, which transfers the knowledge of straggler models into the global model without slowing down training, thus balancing efficiency and model performance. We have evaluated the proposed algorithm on three popular datasets. The experimental results show that, under heterogeneous conditions, FedTd improves training efficiency and maintains good model accuracy compared with baseline methods, exhibiting strong robustness against stragglers. With our approach, the running time can be accelerated by 1.97-3.32× under scenarios with higher levels of data heterogeneity.
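The abstract describes the two-stage fusion only at a high level. As an illustrative aid, the sketch below (PyTorch) shows one way knowledge from a late straggler model could be distilled into an already-aggregated global model so that the synchronous round does not have to wait for it. The function name distill_straggler, the temperature, the learning rate, and the use of a proxy/unlabeled data loader are assumptions made for illustration; this is not the paper's actual FedTd implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

def distill_straggler(global_model: nn.Module,
                      straggler_model: nn.Module,
                      proxy_loader,              # hypothetical proxy/unlabeled data batches
                      temperature: float = 2.0,  # assumed softening temperature
                      epochs: int = 1,
                      lr: float = 1e-3) -> nn.Module:
    """Fold a straggler's late knowledge into the global model via soft labels."""
    straggler_model.eval()   # teacher: the straggler's (late) local model
    global_model.train()     # student: the already-aggregated global model
    optimizer = torch.optim.SGD(global_model.parameters(), lr=lr)
    kd_loss = nn.KLDivLoss(reduction="batchmean")

    for _ in range(epochs):
        for x in proxy_loader:
            with torch.no_grad():
                # Soft targets produced by the straggler, softened by the temperature.
                teacher_probs = F.softmax(straggler_model(x) / temperature, dim=1)
            # The global model is trained to mimic the straggler's soft targets.
            student_log_probs = F.log_softmax(global_model(x) / temperature, dim=1)
            loss = (temperature ** 2) * kd_loss(student_log_probs, teacher_probs)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return global_model

Distilling through soft targets, rather than waiting to average the straggler's raw weights, lets the server incorporate the straggler's contribution after the synchronous aggregation has already completed, which matches the abstract's stated goal of transferring straggler knowledge without delaying training.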
Pages: 3067-3083
Number of pages: 17
Related papers
50 records in total
  • [21] Federated Split Learning via Mutual Knowledge Distillation
    Luo, Linjun
    Zhang, Xinglin
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11 (03): 2729 - 2741
  • [22] FedX: Unsupervised Federated Learning with Cross Knowledge Distillation
    Han, Sungwon
    Park, Sungwon
    Wu, Fangzhao
    Kim, Sundong
    Wu, Chuhan
    Xie, Xing
    Cha, Meeyoung
    COMPUTER VISION - ECCV 2022, PT XXX, 2022, 13690 : 691 - 707
  • [23] Heterogeneous Defect Prediction Based on Federated Transfer Learning via Knowledge Distillation
    Wang, Aili
    Zhang, Yutong
    Yan, Yixin
    IEEE ACCESS, 2021, 9 : 29530 - 29540
  • [24] FedCVG: a two-stage robust federated learning optimization algorithm
    Zhang, Runze
    Zhang, Yang
    Zhao, Yating
    Jia, Bin
    Lian, Wenjuan
    SCIENTIFIC REPORTS, 15 (1)
  • [25] A Novel Two-Stage Knowledge Distillation Framework for Skeleton-Based Action Prediction
    Liu, Cuiwei
    Zhao, Xiaoxue
    Li, Zhaokui
    Yan, Zhuo
    Du, Chong
    IEEE SIGNAL PROCESSING LETTERS, 2022, 29 : 1918 - 1922
  • [26] FedGraph-KD: An Effective Federated Graph Learning Scheme Based on Knowledge Distillation
    Wang, Shiyu
    Xie, Jiahao
    Lu, Mingming
    Xiong, Neal N.
    2023 IEEE 9TH INTL CONFERENCE ON BIG DATA SECURITY ON CLOUD, BIGDATASECURITY, IEEE INTL CONFERENCE ON HIGH PERFORMANCE AND SMART COMPUTING, HPSC AND IEEE INTL CONFERENCE ON INTELLIGENT DATA AND SECURITY, IDS, 2023, : 130 - 134
  • [27] Digital Twin-Assisted Knowledge Distillation Framework for Heterogeneous Federated Learning
    Wang, Xiucheng
    Cheng, Nan
    Ma, Longfei
    Sun, Ruijin
    Chai, Rong
    Lu, Ning
    CHINA COMMUNICATIONS, 2023, 20 (02) : 61 - 78
  • [28] Adaptive Block-Wise Regularization and Knowledge Distillation for Enhancing Federated Learning
    Liu, Jianchun
    Zeng, Qingmin
    Xu, Hongli
    Xu, Yang
    Wang, Zhiyuan
    Huang, He
    IEEE-ACM TRANSACTIONS ON NETWORKING, 2024, 32 (01) : 791 - 805
  • [29] Digital Twin-Assisted Knowledge Distillation Framework for Heterogeneous Federated Learning
    Xiucheng Wang
    Nan Cheng
    Longfei Ma
    Ruijin Sun
    Rong Chai
    Ning Lu
    CHINA COMMUNICATIONS, 2023, 20 (02) : 61 - 78
  • [30] Personalized Decentralized Federated Learning with Knowledge Distillation
    Jeong, Eunjeong
    Kountouris, Marios
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 1982 - 1987