Adaptive client selection and model aggregation for heterogeneous federated learning

Cited by: 4
Authors
Zhai, Rui [1 ,2 ]
Jin, Haozhe [1 ,2 ]
Gong, Wei [4 ,5 ]
Lu, Ke [1 ,3 ]
Liu, Yanhong [1 ,2 ]
Song, Yalin [1 ,2 ]
Yu, Junyang [1 ,2 ]
Affiliations
[1] Henan Univ, Sch Software, Kaifeng 475004, Henan, Peoples R China
[2] Henan Univ, Henan Prov Engn Res Ctr Intelligent Data Proc, Kaifeng 475004, Henan, Peoples R China
[3] Univ Chinese Acad Sci, Sch Engn Sci, Beijing 100049, Peoples R China
[4] Tongji Univ, Dept Control Sci & Engn, Shanghai 201804, Peoples R China
[5] Tongji Univ, Shanghai Res Inst Intelligent Autonomous Syst, Shanghai 201210, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Federated learning; Client selection; Multi-armed bandit; Model aggregation;
DOI
10.1007/s00530-024-01386-w
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Federated Learning (FL) is a distributed machine learning method that allows multiple clients to collaborate on model training without sharing raw data. However, FL faces challenges with data heterogeneity, leading to reduced model accuracy and slower convergence. Although existing client selection methods can alleviate these problems, there is still room to improve FL performance. To tackle these problems, we first propose a novel client selection method based on the Multi-Armed Bandit (MAB) framework. The method uses the historical training information uploaded by each client to calculate its correlation and contribution. The calculated values are then used to select the set of clients that brings the most benefit, i.e., maximizing both model accuracy and convergence speed. Second, we propose an adaptive global model aggregation method that utilizes the local training information of selected clients to dynamically assign weights to local model parameters. Extensive experiments on various datasets with different heterogeneous settings demonstrate that our proposed method effectively improves FL performance compared with several benchmarks.
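The abstract describes two components: MAB-based client selection driven by per-client rewards computed from historical training information, and weighted aggregation of local model parameters. The paper's exact reward definition and weighting scheme are not given here, so the sketch below is only illustrative: it uses a standard UCB-style bandit score over hypothetical per-round client rewards (e.g., a combined correlation/contribution value) and a FedAvg-style weighted average for aggregation. The function names and the reward history format are assumptions, not the authors' API.

```python
import numpy as np

def select_clients(history, num_select, round_t, c=1.0):
    """UCB-style client selection.

    'history' maps client id -> list of past per-round rewards (assumed here
    to combine the correlation and contribution values the paper mentions).
    Score = empirical mean reward + exploration bonus; unexplored clients
    are selected first.
    """
    scores = {}
    for cid, rewards in history.items():
        if not rewards:
            scores[cid] = float("inf")  # never-selected arms get priority
        else:
            mean = float(np.mean(rewards))
            bonus = c * np.sqrt(np.log(round_t + 1) / len(rewards))
            scores[cid] = mean + bonus
    return sorted(scores, key=scores.get, reverse=True)[:num_select]

def aggregate(local_params, weights):
    """FedAvg-style weighted average of local model parameters, with weights
    derived dynamically from the clients' local training information."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize so the weights sum to 1
    return sum(wi * np.asarray(p, dtype=float)
               for wi, p in zip(w, local_params))
```

In this sketch, a never-selected client (empty history) always wins the exploration trade-off, mirroring the usual MAB requirement that every arm be sampled at least once before exploitation dominates.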
Pages: 15