Bi-level Sampling: Improved Clients Selection in Heterogeneous Settings for Federated Learning

Times Cited: 0
Authors
Xiao, Danyang [1]
Zhan, Congcong [1]
Li, Jialun [1]
Wu, Weigang [1]
Affiliations
[1] Sun Yat Sen Univ, Guangzhou, Peoples R China
Source
2023 IEEE INTERNATIONAL PERFORMANCE, COMPUTING, AND COMMUNICATIONS CONFERENCE, IPCCC | 2023
Funding
National Natural Science Foundation of China;
Keywords
federated learning; clients selection; clustering; parallel training; distributed machine learning;
DOI
10.1109/IPCCC59175.2023.10253887
Chinese Library Classification
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812;
Abstract
Client selection (a.k.a. client sampling) is one of the hot topics in Federated Learning (FL). In each communication round, selecting a subset of clients to participate in aggregation can effectively reduce the communication overhead caused by exchanging model parameters. However, due to statistical heterogeneity in FL, selecting clients randomly may degrade the performance of the aggregated global model. Existing approaches to client selection first cluster clients and then sample (select) representative clients from each cluster. However, these clustering-based approaches can be time-intensive or computationally complex. To address these issues, we introduce Bi-level Sampling, a clustering-based approach for client selection. After multinomial distribution sampling, Bi-level Sampling clusters clients based on weighted per-label mean class scores and then selects the clients that participate in federated learning in each round. Bi-level Sampling leads to better client representativity and reduced variance of the clients' stochastic aggregation weights in FL. Our approach can be integrated into typical FL frameworks. Experimental results show that, compared with state-of-the-art approaches, our approach demonstrates significantly more stable and accurate convergence behavior, achieving higher test accuracy and shorter training time, especially in highly non-IID settings.
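The two-stage procedure described in the abstract (multinomial sampling of candidates, then clustering on weighted per-label mean class scores, then selection) can be sketched in Python as below. Everything in the sketch is an illustrative assumption rather than the paper's actual implementation: the function name bi_level_sampling, the use of k-means for the clustering step, the weighting of scores by sampling probability, and the round-robin one-per-cluster pick are all hypothetical choices.

import numpy as np
from sklearn.cluster import KMeans

def bi_level_sampling(per_label_scores, data_sizes,
                      n_candidates=20, n_selected=10, n_clusters=5, seed=0):
    # per_label_scores: (n_clients, n_labels) array; a hypothetical
    # stand-in for each client's per-label mean class scores.
    # data_sizes: local dataset size per client, used as sampling weights.
    rng = np.random.default_rng(seed)

    # Level 1: multinomial sampling of candidate clients, with
    # probabilities proportional to local data size.
    probs = np.asarray(data_sizes, dtype=float)
    probs = probs / probs.sum()
    counts = rng.multinomial(n_candidates, probs)
    candidates = np.flatnonzero(counts)

    # Level 2: cluster the candidates on their weighted per-label scores
    # so that each cluster groups statistically similar clients.
    feats = per_label_scores[candidates] * probs[candidates, None]
    k = min(n_clusters, len(candidates))
    labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(feats)

    # Pick clients round-robin across clusters so the final selection
    # stays representative of every cluster, which is what reduces the
    # variance of the stochastic aggregation weights.
    members = [candidates[labels == c].tolist() for c in range(k)]
    selected, c = [], 0
    while len(selected) < n_selected and any(members):
        if members[c]:
            selected.append(members[c].pop(rng.integers(len(members[c]))))
        c = (c + 1) % k
    return np.array(selected)

For example, with 100 clients and 10 labels, bi_level_sampling(np.random.rand(100, 10), np.random.randint(50, 500, size=100)) returns the indices of the clients selected for one round.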
Pages: 6