Joint Client-and-Sample Selection for Federated Learning via Bi-Level Optimization

Cited by: 0
Authors
Li, Anran [1 ]
Wang, Guangjing [2 ]
Hu, Ming [3 ]
Sun, Jianfei [3 ]
Zhang, Lan [4 ]
Tuan, Luu Anh [5 ]
Yu, Han [5 ]
Affiliations
[1] Yale Univ, Sch Med, Dept Biomed Informat & Data Sci, New Haven, CT 06520 USA
[2] Univ S Florida, Dept Comp Sci & Engn, Tampa, FL 33620 USA
[3] Singapore Management Univ, Sch Comp & Informat Syst, Singapore 188065, Singapore
[4] Univ Sci & Technol China, Sch Comp Sci & Technol, Hefei 230026, Peoples R China
[5] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
Funding
National Research Foundation of Singapore; Fundamental Research Funds for the Central Universities; National Key Research and Development Program of China;
Keywords
Training; Computational modeling; Data models; Noise measurement; Noise; Optimization; Servers; Bi-level optimization; federated learning; noisy data detection; sample selection;
DOI
10.1109/TMC.2024.3455331
Chinese Library Classification: TP [Automation and Computer Technology];
Discipline Classification Code: 0812;
Abstract
Federated Learning (FL) enables massive numbers of local data owners to collaboratively train a deep learning model without disclosing their private data. The importance of local data samples from various data owners to FL models varies widely. This is exacerbated by the presence of noisy data, which exhibit large losses similar to those of important (hard) samples. Currently, no FL approach can effectively distinguish hard samples (which are beneficial) from noisy samples (which are harmful). To bridge this gap, we propose the joint Federated Meta-Weighting based Client and Sample Selection (FedMW-CSS) approach to simultaneously mitigate label noise and perform hard sample selection. It is a bi-level optimization approach for FL client-and-sample selection and global model construction that achieves hard-sample-aware, noise-robust learning in a privacy-preserving manner. It performs meta-learning-based online approximation to iteratively update global FL models, select the most positively influential samples, and deal with training data noise. To exploit both instance-level and class-level information for better performance, FedMW-CSS efficiently learns a class-level weight by manipulating gradients at the class level, e.g., it performs a gradient descent step on class-level weights that relies only on intermediate gradients. Theoretically, we analyze the privacy guarantees and convergence of FedMW-CSS. Extensive experimental comparisons against eight state-of-the-art baselines on six real-world datasets, in the presence of data noise and heterogeneity, show that FedMW-CSS achieves up to 28.5% higher test accuracy while reducing communication and computation costs by at least 49.3% and 1.2%, respectively.
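The class-level meta-weighting step described in the abstract can be sketched as follows. This is a minimal illustrative toy, not the paper's implementation: it uses a linear model, a single weighted inner gradient step, and a finite-difference approximation of the meta-gradient with respect to per-class weights (the paper instead uses intermediate gradients). All function and variable names here are hypothetical.

```python
import numpy as np

def inner_step(theta, X, y, classes, class_w, lr=0.1):
    # One class-weighted gradient step on the model parameters
    # for a squared-error linear model (stand-in for the FL client model).
    residual = X @ theta - y
    grad = 2 * X.T @ (class_w[classes] * residual) / len(y)
    return theta - lr * grad

def meta_loss(theta, X_meta, y_meta):
    # Loss on a small trusted (clean-label) meta set.
    return np.mean((X_meta @ theta - y_meta) ** 2)

def update_class_weights(theta, X, y, classes, class_w, X_meta, y_meta,
                         meta_lr=0.5, eps=1e-4):
    # Approximate d(meta_loss after inner step)/d(class weights) by
    # central finite differences, then take one descent step on the weights.
    grad_w = np.zeros_like(class_w)
    for c in range(len(class_w)):
        w_plus = class_w.copy();  w_plus[c] += eps
        w_minus = class_w.copy(); w_minus[c] -= eps
        l_plus = meta_loss(inner_step(theta, X, y, classes, w_plus), X_meta, y_meta)
        l_minus = meta_loss(inner_step(theta, X, y, classes, w_minus), X_meta, y_meta)
        grad_w[c] = (l_plus - l_minus) / (2 * eps)
    new_w = np.clip(class_w - meta_lr * grad_w, 0.0, None)
    return new_w / max(new_w.sum(), 1e-12) * len(new_w)  # renormalize

# Toy demo: class 0 has clean labels, class 1 has corrupted labels.
rng = np.random.default_rng(0)
d, n = 5, 40
theta_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
classes = np.array([0] * (n // 2) + [1] * (n // 2))
y = X @ theta_true
y[classes == 1] *= -1.0            # flip the noisy class's targets
X_meta = rng.normal(size=(10, d))  # small trusted meta set
y_meta = X_meta @ theta_true

new_w = update_class_weights(np.zeros(d), X, y, classes, np.ones(2),
                             X_meta, y_meta)
```

After one meta step, the weight of the noisy class drops below that of the clean class, which is the down-weighting behavior the abstract attributes to the class-level weights.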
Pages: 15196-15209
Page count: 14
Related Papers
50 records in total
  • [31] Embracing Federated Learning: Enabling Weak Client Participation via Partial Model Training
    Lee, Sunwoo
    Zhang, Tuo
    Prakash, Saurav
    Niu, Yue
    Avestimehr, Salman
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (12) : 11133 - 11143
  • [32] Asynchronous Wireless Federated Learning With Probabilistic Client Selection
    Yang, Jiarong
    Liu, Yuan
    Chen, Fangjiong
    Chen, Wen
    Li, Changle
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2024, 23 (07) : 7144 - 7158
  • [33] Age of Information Based Client Selection for Wireless Federated Learning With Diversified Learning Capabilities
    Dong, Liran
    Zhou, Yiqing
    Liu, Ling
    Qi, Yanli
    Zhang, Yu
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (12) : 14934 - 14945
  • [34] Data Distribution-Aware Online Client Selection Algorithm for Federated Learning in Heterogeneous Networks
    Lee, Jaewook
    Ko, Haneul
    Seo, Sangwon
    Pack, Sangheon
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2023, 72 (01) : 1127 - 1136
  • [35] Joint Client Scheduling and Quantization Optimization in Energy Harvesting-Enabled Federated Learning Networks
    Ni, Zhengwei
    Zhang, Zhaoyang
    Luong, Nguyen Cong
    Niyato, Dusit
    Kim, Dong In
    Feng, Shaohan
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2024, 23 (08) : 9566 - 9582
  • [36] Vehicle Selection and Resource Optimization for Federated Learning in Vehicular Edge Computing
    Xiao, Huizi
    Zhao, Jun
    Pei, Qingqi
    Feng, Jie
    Liu, Lei
    Shi, Weisong
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (08) : 11073 - 11087
  • [37] Federated Transfer Learning With Client Selection for Intrusion Detection in Mobile Edge Computing
    Cheng, Yanyu
    Lu, Jianyuan
    Niyato, Dusit
    Lyu, Biao
    Kang, Jiawen
    Zhu, Shunmin
    IEEE COMMUNICATIONS LETTERS, 2022, 26 (03) : 552 - 556
  • [38] NIFL: A Statistical Measures-Based Method for Client Selection in Federated Learning
    Mohamed, M'haouach
    Houdou, Anass
    Alami, Hamza
    Fardousse, Khalid
    Berrada, Ismail
    IEEE ACCESS, 2022, 10 : 124766 - 124776
  • [39] A review on client selection models in federated learning
    Panigrahi, Monalisa
    Bharti, Sourabh
    Sharma, Arun
    WILEY INTERDISCIPLINARY REVIEWS-DATA MINING AND KNOWLEDGE DISCOVERY, 2023, 13 (06)
  • [40] Joint Age-Based Client Selection and Resource Allocation for Communication-Efficient Federated Learning Over NOMA Networks
    Wu, Bibo
    Fang, Fang
    Wang, Xianbin
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2024, 72 (01) : 179 - 192