Joint Client-and-Sample Selection for Federated Learning via Bi-Level Optimization

Cited by: 0
Authors
Li, Anran [1 ]
Wang, Guangjing [2 ]
Hu, Ming [3 ]
Sun, Jianfei [3 ]
Zhang, Lan [4 ]
Tuan, Luu Anh [5 ]
Yu, Han [5 ]
Affiliations
[1] Yale Univ, Sch Med, Dept Biomed Informat & Data Sci, New Haven, CT 06520 USA
[2] Univ S Florida, Dept Comp Sci & Engn, Tampa, FL 33620 USA
[3] Singapore Management Univ, Sch Comp & Informat Syst, Singapore 188065, Singapore
[4] Univ Sci & Technol China, Sch Comp Sci & Technol, Hefei 230026, Peoples R China
[5] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
Funding
National Research Foundation, Singapore; Fundamental Research Funds for the Central Universities; National Key Research and Development Program of China;
Keywords
Training; Computational modeling; Data models; Noise measurement; Noise; Optimization; Servers; Bi-level optimization; federated learning; noisy data detection; sample selection;
DOI
10.1109/TMC.2024.3455331
Chinese Library Classification (CLC)
TP [Automation technology, computer technology];
Discipline code
0812 ;
Abstract
Federated Learning (FL) enables massive numbers of local data owners to collaboratively train a deep learning model without disclosing their private data. The importance of local data samples from different data owners to FL models varies widely. This is exacerbated by the presence of noisy data, whose large losses resemble those of important (hard) samples. There is currently no FL approach that can effectively distinguish hard samples (which are beneficial) from noisy samples (which are harmful). To bridge this gap, we propose the joint Federated Meta-Weighting based Client and Sample Selection (FedMW-CSS) approach, which mitigates label noise while selecting hard samples. It is a bi-level optimization approach for FL client-and-sample selection and global model construction that achieves hard-sample-aware, noise-robust learning in a privacy-preserving manner. It performs meta-learning-based online approximation to iteratively update global FL models, select the most positively influential samples, and handle training-data noise. To exploit both instance-level and class-level information for better performance, FedMW-CSS efficiently learns class-level weights by manipulating gradients at the class level, i.e., it performs a gradient descent step on the class-level weights that relies only on intermediate gradients. Theoretically, we analyze the privacy guarantees and convergence of FedMW-CSS. Extensive experimental comparisons against eight state-of-the-art baselines on six real-world datasets in the presence of data noise and heterogeneity show that FedMW-CSS achieves up to 28.5% higher test accuracy, while reducing communication and computation costs by at least 49.3% and 1.2%, respectively.
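The abstract's core mechanism, a gradient descent step on class-level weights that relies only on intermediate gradients, follows the general meta-weighting pattern: take a virtual (inner) update of the model under the current weights, evaluate the gradient of a clean meta/validation loss at the virtual parameters, and adjust each class weight by its gradient alignment with the meta gradient. The sketch below is an illustrative toy (linear model, squared loss, single client), not the paper's exact FedMW-CSS algorithm; the function name, learning rates, and normalization are assumptions for demonstration.

```python
import numpy as np

def per_sample_grads(theta, X, y):
    # Gradient of 0.5 * (x . theta - y)^2 for each sample: (pred - y) * x
    residual = X @ theta - y
    return residual[:, None] * X  # shape (n, d)

def class_weight_step(theta, X, y, classes, X_meta, y_meta, lr=0.1, meta_lr=1.0):
    """One illustrative meta-weighting step on class-level weights.

    classes[i] is the class index of sample i; (X_meta, y_meta) is a small
    clean validation set. Returns updated non-negative class weights.
    """
    n_classes = int(classes.max()) + 1
    w = np.ones(n_classes)
    G = per_sample_grads(theta, X, y)  # intermediate per-sample gradients
    # Class-level gradients: sum of per-sample gradients within each class
    g_c = np.stack([G[classes == c].sum(0) for c in range(n_classes)])
    # Virtual (inner) model update under the current class weights
    theta_virtual = theta - lr * (w @ g_c)
    # Meta gradient on the clean set, evaluated at the virtual parameters
    g_meta = per_sample_grads(theta_virtual, X_meta, y_meta).sum(0)
    # d(meta loss)/d(w_c) = -lr * <g_meta, g_c>, so gradient descent on w
    # increases w_c when the class gradient aligns with the meta gradient
    w = w + meta_lr * lr * (g_c @ g_meta)
    w = np.clip(w, 0.0, None)                 # keep weights non-negative
    w = w / max(w.sum(), 1e-12) * n_classes   # normalize to mean 1
    return w
```

On toy data where one class has clean labels and the other has corrupted labels, this step assigns the clean class a larger weight, which is the qualitative behavior the abstract describes for down-weighting noisy data.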
Pages: 15196-15209
Page count: 14
Related Papers
50 records in total
  • [21] Adaptive Client Selection in Resource Constrained Federated Learning Systems: A Deep Reinforcement Learning Approach
    Zhang, Hangjia
    Xie, Zhijun
    Zarei, Roozbeh
    Wu, Tao
    Chen, Kewei
    IEEE ACCESS, 2021, 9 : 98423 - 98432
  • [22] Client-Side Optimization Strategies for Communication-Efficient Federated Learning
    Mills, Jed
    Hu, Jia
    Min, Geyong
    IEEE COMMUNICATIONS MAGAZINE, 2022, 60 (07) : 60 - 66
  • [23] Client Selection for Federated Learning With Non-IID Data in Mobile Edge Computing
    Zhang, Wenyu
    Wang, Xiumin
    Zhou, Pan
    Wu, Weiwei
    Zhang, Xinglin
    IEEE ACCESS, 2021, 9 : 24462 - 24474
  • [24] Jointly Optimizing Client Selection and Resource Management in Wireless Federated Learning for Internet of Things
    Yu, Liangkun
    Albelaihi, Rana
    Sun, Xiang
    Ansari, Nirwan
    Devetsikiotis, Michael
    IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (06) : 4385 - 4395
  • [25] Delay-Constrained Client Selection for Heterogeneous Federated Learning in Intelligent Transportation Systems
    Zhang, Weiwen
    Chen, Yanxi
    Jiang, Yifeng
    Liu, Jianqi
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11 (01): : 1042 - 1054
  • [26] PriFairFed: A Local Differentially Private Federated Learning Algorithm for Client-Level Fairness
    Hu, Chuang
    Wu, Nanxi
    Shi, Siping
    Liu, Xuan
    Luo, Bing
    Wang, Kanye Ye
    Jiang, Jiawei
    Cheng, Dazhao
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2025, 24 (05) : 3993 - 4005
  • [27] FedMCCS: Multicriteria Client Selection Model for Optimal IoT Federated Learning
    AbdulRahman, Sawsan
    Tout, Hanine
    Mourad, Azzam
    Talhi, Chamseddine
    IEEE INTERNET OF THINGS JOURNAL, 2021, 8 (06) : 4723 - 4735
  • [28] Joint Client and Cross-Client Edge Selection for Cost-Efficient Federated Learning of Graph Convolutional Networks
    Huang, Guangjing
    Chen, Xu
    Wu, Qiong
    Li, Jingyi
    Huang, Qianyi
    IEEE/ACM TRANSACTIONS ON NETWORKING, 2024,
  • [29] Joint Client Selection and Receive Beamforming for Over-the-Air Federated Learning With Energy Harvesting
    Chen, Caijuan
    Chiang, Yi-Han
    Lin, Hai
    Lui, John C. S.
    Ji, Yusheng
    IEEE OPEN JOURNAL OF THE COMMUNICATIONS SOCIETY, 2023, 4 : 1127 - 1140
  • [30] Context-Aware Online Client Selection for Hierarchical Federated Learning
    Qu, Zhe
    Duan, Rui
    Chen, Lixing
    Xu, Jie
    Lu, Zhuo
    Liu, Yao
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2022, 33 (12) : 4353 - 4367