Joint Client-and-Sample Selection for Federated Learning via Bi-Level Optimization

Cited by: 0
Authors
Li, Anran [1 ]
Wang, Guangjing [2 ]
Hu, Ming [3 ]
Sun, Jianfei [3 ]
Zhang, Lan [4 ]
Tuan, Luu Anh [5 ]
Yu, Han [5 ]
Affiliations
[1] Yale Univ, Sch Med, Dept Biomed Informat & Data Sci, New Haven, CT 06520 USA
[2] Univ S Florida, Dept Comp Sci & Engn, Tampa, FL 33620 USA
[3] Singapore Management Univ, Sch Comp & Informat Syst, Singapore 188065, Singapore
[4] Univ Sci & Technol China, Sch Comp Sci & Technol, Hefei 230026, Peoples R China
[5] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
Funding
National Research Foundation of Singapore; Fundamental Research Funds for the Central Universities; National Key R&D Program of China;
Keywords
Training; Computational modeling; Data models; Noise measurement; Noise; Optimization; Servers; Bi-level optimization; federated learning; noisy data detection; sample selection;
DOI
10.1109/TMC.2024.3455331
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812 ;
Abstract
Federated Learning (FL) enables massive local data owners to collaboratively train a deep learning model without disclosing their private data. The importance of local data samples from various data owners to FL models varies widely. This is exacerbated by the presence of noisy data, which exhibit large losses similar to important (hard) samples. Currently, no FL approach can effectively distinguish hard samples (which are beneficial) from noisy samples (which are harmful). To bridge this gap, we propose the joint Federated Meta-Weighting based Client and Sample Selection (FedMW-CSS) approach to simultaneously mitigate label noise and perform hard sample selection. It is a bi-level optimization approach for FL client-and-sample selection and global model construction to achieve hard-sample-aware, noise-robust learning in a privacy-preserving manner. It performs meta-learning based online approximation to iteratively update global FL models, select the most positively influential samples, and deal with training data noise. To exploit both instance-level and class-level information for better performance, FedMW-CSS efficiently learns a class-level weight by manipulating gradients at the class level, e.g., it performs a gradient descent step on class-level weights that relies only on intermediate gradients. Theoretically, we analyze the privacy guarantees and convergence of FedMW-CSS. Extensive experimental comparisons against eight state-of-the-art baselines on six real-world datasets in the presence of data noise and heterogeneity show that FedMW-CSS achieves up to 28.5% higher test accuracy, while saving communication and computation costs by at least 49.3% and 1.2%, respectively.
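The meta-weighting idea the abstract describes — updating per-sample weights by a gradient step that relies only on intermediate gradients, so that noisy samples are down-weighted while informative ones are kept — can be illustrated with a minimal learning-to-reweight sketch on a toy linear model. All data and hyperparameters below are illustrative assumptions; this is not the paper's FedMW-CSS implementation, and it omits the federated client-selection and privacy components entirely.

```python
import numpy as np

# Toy 1-D linear regression: clean targets follow y = 2x, but the first
# two samples have flipped labels (y = -2x) to mimic label noise.
X = np.array([[1.0], [-1.0], [0.5], [2.0], [-2.0], [1.5], [-0.5], [1.0]])
y = 2.0 * X[:, 0]
y[:2] = -y[:2]                          # corrupt the first two samples
X_val = np.array([[1.0], [-1.0], [2.0], [-2.0]])  # small clean validation set
y_val = 2.0 * X_val[:, 0]

theta = np.zeros(1)                     # model parameter
w = np.full(len(X), 1.0 / len(X))       # per-sample weights
lr, meta_lr = 0.1, 1.0

for _ in range(50):
    resid = X @ theta - y               # per-sample residuals
    grads = resid[:, None] * X          # per-sample loss gradients, shape (n, d)
    # Inner (virtual) step: one weighted gradient update of the model
    theta_virt = theta - lr * (w @ grads)
    # Outer step: gradient of the clean validation loss w.r.t. each w_i,
    # obtained by the chain rule through theta_virt (intermediate gradients only)
    val_grad = ((X_val @ theta_virt - y_val)[:, None] * X_val).mean(axis=0)
    dw = -lr * grads @ val_grad
    w = np.maximum(w - meta_lr * dw, 0.0)
    w /= w.sum()                        # keep the weights a distribution
    # Actual model update with the refreshed weights
    theta = theta - lr * (w @ grads)
```

After training, the corrupted samples receive (near-)zero weight and the model recovers the clean slope, which is the behavior the sample-selection mechanism relies on.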
Pages: 15196 - 15209
Page count: 14
Related Papers
50 in total
  • [41] Online Client Selection for Asynchronous Federated Learning With Fairness Consideration
    Zhu, Hongbin
    Zhou, Yong
    Qian, Hua
    Shi, Yuanming
    Chen, Xu
    Yang, Yang
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2023, 22 (04) : 2493 - 2506
  • [42] Provable Privacy Advantages of Decentralized Federated Learning via Distributed Optimization
    Yu, Wenrui
    Li, Qiongxiu
    Lopuhaa-Zwakenberg, Milan
    Christensen, Mads Graesboll
    Heusdens, Richard
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2025, 20 : 822 - 838
  • [43] Joint Client Selection and Privacy Compensation for Differentially Private Federated Learning
    Xu, Ruichen
    Zhang, Ying-Jun Angela
    Huang, Jianwei
    IEEE INFOCOM 2024-IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS, INFOCOM WKSHPS 2024, 2024,
  • [44] Projection-Free Stochastic Bi-Level Optimization
    Akhtar, Zeeshan
    Bedi, Amrit Singh
    Thomdapu, Srujan Teja
    Rajawat, Ketan
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2022, 70 : 6332 - 6347
  • [45] Reputation-Aware Federated Learning Client Selection Based on Stochastic Integer Programming
    Tan, Xavier
    Ng, Wei
    Lim, Wei
    Xiong, Zehui
    Niyato, Dusit
    Yu, Han
    IEEE TRANSACTIONS ON BIG DATA, 2024, 10 (06) : 953 - 964
  • [46] Long-Term Client Selection for Federated Learning With Non-IID Data: A Truthful Auction Approach
    Tan, Jinghong
    Liu, Zhian
    Guo, Kun
    Zhao, Mingxiong
    IEEE INTERNET OF THINGS JOURNAL, 2025, 12 (05) : 4953 - 4970
  • [47] Joint Client and Resource Optimization for Federated Learning in Wireless IoT Networks
    Zhao, Jie
    Ni, Yiyang
    Cheng, Yulun
    APPLIED SCIENCES-BASEL, 2024, 14 (02):
  • [48] Enhancing Federated Learning With Spectrum Allocation Optimization and Device Selection
    Zhang, Tinghao
    Lam, Kwok-Yan
    Zhao, Jun
    Li, Feng
    Han, Huimei
    Jamil, Norziana
    IEEE-ACM TRANSACTIONS ON NETWORKING, 2023, 31 (05) : 1981 - 1996
  • [49] AMFL: Asynchronous Multi-level Federated Learning with Client Selection
    Li, Xuerui
    Zhao, Yangming
    Qiao, Chunming
    2024 IEEE/CIC INTERNATIONAL CONFERENCE ON COMMUNICATIONS IN CHINA, ICCC, 2024,
  • [50] A Client Selection Method Based on Loss Function Optimization for Federated Learning
    Zeng, Yan
    Teng, Siyuan
    Xiang, Tian
    Zhang, Jilin
    Mu, Yuankai
    Ren, Yongjian
    Wan, Jian
    CMES-COMPUTER MODELING IN ENGINEERING & SCIENCES, 2023, 137 (01) : 1047 - 1064