Federated Noisy Client Learning

Cited: 10
Authors
Tam, Kahou [1 ]
Li, Li [1 ]
Han, Bo [2 ]
Xu, Chengzhong [1 ]
Fu, Huazhu [3 ]
Affiliations
[1] Univ Macau, State Key Lab Internet Things Smart City, Macau, Peoples R China
[2] Hong Kong Baptist Univ, Dept Comp Sci, Hong Kong, Peoples R China
[3] ASTAR, Inst High Performance Comp, Singapore 138632, Singapore
Keywords
Noise measurement; Data models; Training; Computational modeling; Adaptation models; Training data; Servers; Federated learning (FL); label noise; noisy client; noisy learning
DOI
10.1109/TNNLS.2023.3336050
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Federated learning (FL) collaboratively trains a shared global model across multiple local clients while keeping the training data decentralized to preserve data privacy. However, standard FL methods ignore the noisy client issue, which may harm the overall performance of the shared model. We first investigate the critical issue caused by noisy clients in FL and quantify their negative impact in terms of the representations learned by different layers. We make the following two key observations: 1) noisy clients can severely impact the convergence and performance of the global model in FL and 2) noisy clients induce greater bias in the deeper layers of the global model than in the shallower layers. Based on these observations, we propose federated noisy client learning (Fed-NCL), a framework that conducts robust FL with noisy clients. Specifically, Fed-NCL first identifies the noisy clients by estimating both the data quality and the model divergence. A robust layerwise aggregation scheme then adaptively aggregates the local model of each client to deal with the data heterogeneity caused by the noisy clients. We further perform label correction on the noisy clients to improve the generalization of the global model. Experimental results on various datasets demonstrate that our algorithm boosts the performance of different state-of-the-art systems with noisy clients. Our code is available at https://github.com/TKH666/Fed-NCL.
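The two server-side steps the abstract describes — flagging noisy clients by their model divergence from the global model, then down-weighting those clients more strongly in deeper layers during aggregation — can be sketched as follows. This is a minimal illustration in NumPy, not the paper's actual implementation: the z-score divergence threshold, the linear depth-dependent discount, and the function names `flag_noisy_clients` and `layerwise_aggregate` are all assumptions made for this sketch (Fed-NCL also uses data-quality estimation and label correction, which are omitted here).

```python
import numpy as np

def flag_noisy_clients(client_layers, global_layers, z_thresh=1.0):
    """Heuristic noisy-client detection: flag clients whose parameter
    divergence from the global model is an outlier (z-score > z_thresh).
    `client_layers` is a list of per-client lists of layer arrays."""
    divs = np.array([
        sum(np.linalg.norm(cl - gl) for cl, gl in zip(layers, global_layers))
        for layers in client_layers
    ])
    z = (divs - divs.mean()) / (divs.std() + 1e-12)
    return z > z_thresh

def layerwise_aggregate(client_layers, noisy_mask, deep_discount=0.5):
    """Layerwise weighted average: noisy clients keep full weight in the
    shallowest layer and are discounted progressively toward the deepest
    layer, reflecting the observation that deeper layers are biased more."""
    n_layers = len(client_layers[0])
    aggregated = []
    for l in range(n_layers):
        depth_frac = l / max(n_layers - 1, 1)  # 0 at first layer, 1 at last
        weights = np.array([
            1.0 - deep_discount * depth_frac if noisy else 1.0
            for noisy in noisy_mask
        ])
        weights /= weights.sum()  # normalize to a convex combination
        layer_stack = np.stack([layers[l] for layers in client_layers])
        aggregated.append(np.tensordot(weights, layer_stack, axes=1))
    return aggregated
```

Under this sketch, a client far from the global model gets flagged, and its influence on the aggregated deepest layer is half its influence on the first layer, while clean clients are averaged with uniform weight throughout.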
Pages: 1799-1812
Page count: 14
Related Papers
50 records in total
  • [1] Are You a Good Client? Client Classification in Federated Learning
    Jeong, Hyejun
    An, Jaeju
    Jeong, Jaehoon
    12TH INTERNATIONAL CONFERENCE ON ICT CONVERGENCE (ICTC 2021): BEYOND THE PANDEMIC ERA WITH ICT CONVERGENCE INNOVATION, 2021, : 1691 - 1696
  • [2] Federated Learning with Client Availability Budgets
    Bao, Yunkai
    Drew, Steve
    Wang, Xin
    Zhou, Jiayu
    Niu, Xiaoguang
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 1902 - 1907
  • [3] Reuse of Client Models in Federated Learning
    Cao, Bokai
    Wu, Weigang
    Zhan, Congcong
    Zhou, Jieying
    2022 IEEE INTERNATIONAL CONFERENCE ON SMART COMPUTING (SMARTCOMP 2022), 2022, : 356 - 361
  • [4] Client Selection in Hierarchical Federated Learning
    Trindade, Silvana
    da Fonseca, Nelson L. S.
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (17): : 28480 - 28495
  • [5] Client Selection for Federated Bayesian Learning
    Yang, Jiarong
    Liu, Yuan
    Kassab, Rahif
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2023, 41 (04) : 915 - 928
  • [6] Robust Federated Learning With Noisy Labels
    Yang, Seunghan
    Park, Hyoungseob
    Byun, Junyoung
    Kim, Changick
    IEEE INTELLIGENT SYSTEMS, 2022, 37 (02) : 35 - 43
  • [7] Federated Learning over Noisy Channels
    Wei, Xizixiang
    Shen, Cong
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2021), 2021,
  • [8] Robust Federated Learning With Noisy Communication
    Ang, Fan
    Chen, Li
    Zhao, Nan
    Chen, Yunfei
    Wang, Weidong
    Yu, F. Richard
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2020, 68 (06) : 3452 - 3464
  • [9] Federated Learning with Noisy User Feedback
    Sharma, Rahul
    Ramakrishna, Anil
    MacLaughlin, Ansel
    Rumshisky, Anna
    Majmudar, Jimit
    Chung, Clement
    Avestimehr, Salman
    Gupta, Rahul
    NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022, : 2726 - 2739
  • [10] Learning Cautiously in Federated Learning with Noisy and Heterogeneous Clients
    Wu, Chenrui
    Li, Zexi
    Wang, Fangxin
    Wu, Chao
    2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME, 2023, : 660 - 665