GraphCS: Graph-based client selection for heterogeneity in federated learning

Cited: 5
Authors
Chang, Tao [1 ]
Li, Li [2 ]
Wu, MeiHan [1 ]
Yu, Wei [3 ]
Wang, Xiaodong [1 ]
Xu, ChengZhong [2 ]
Affiliations
[1] Natl Univ Def Technol, Coll Comp, Key Lab Parallel & Distributed Comp, Changsha, Peoples R China
[2] Univ Macau, State Key Lab Internet Things Smart City, Taipa, Peoples R China
[3] China Elect Technol Grp Corp, Res Inst 30, Chengdu, Peoples R China
Keywords
Federated learning; Client selection; Heterogeneity; Algorithms
DOI
10.1016/j.jpdc.2023.03.003
CLC number
TP301 [Theory and Methods]
Discipline code
081202
Abstract
Federated Learning coordinates many mobile devices to collaboratively train an artificial intelligence model while preserving data privacy. Mobile devices are usually equipped with widely differing hardware configurations, leading to varying training capabilities. At the same time, the distribution of the local training data is highly heterogeneous across clients. Randomly selecting the clients that participate in the training process therefore results in poor model performance and low system efficiency. In this paper, we propose GraphCS, a graph-based client selection framework for heterogeneity in Federated Learning. GraphCS first measures the distribution coupling across clients via the model gradients. It then divides the clients into different groups according to the diversity of their local datasets. In parallel, it estimates the runtime training capability of each client by jointly considering the hardware configuration and the resource contention caused by concurrently running apps. With the distribution coupling information and runtime training capability, GraphCS selects the best clients so as to balance model accuracy against overall training progress. We evaluate the performance of GraphCS with mobile devices with different hardware configurations on various datasets. The experiment results show that our approach improves model accuracy by up to 45.69%. Meanwhile, it reduces communication and computation overhead by up to 87.35% and 89.48%, respectively. Furthermore, GraphCS accelerates the overall training process by up to 35x. (c) 2023 Elsevier Inc. All rights reserved.
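The selection pipeline described in the abstract (group clients by how coupled their data distributions are, then pick capable clients per group) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: a greedy grouping by gradient cosine similarity stands in for the paper's graph-based distribution-coupling analysis, the `sim_threshold` parameter and `speeds` proxy for runtime training capability are hypothetical, and gradients are treated as plain vectors.

```python
import math

def cosine(u, v):
    # Cosine similarity between two gradient vectors; similar gradient
    # directions are used here as a proxy for similar data distributions.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def select_clients(gradients, speeds, sim_threshold=0.5):
    """Greedily group clients whose gradients are similar, then pick
    the fastest client from each group (hypothetical simplification of
    GraphCS's graph-based grouping and capability-aware selection)."""
    groups = []  # each group is a list of client indices
    for i in range(len(gradients)):
        for g in groups:
            # Join the first group whose representative is similar enough.
            if cosine(gradients[i], gradients[g[0]]) >= sim_threshold:
                g.append(i)
                break
        else:
            groups.append([i])  # no similar group found: start a new one
    # From each group, select the client with the highest training speed,
    # so every represented distribution contributes without stragglers.
    return [max(g, key=lambda c: speeds[c]) for g in groups]
```

For example, three clients with gradients `[1, 0]`, `[0.9, 0.1]`, `[0, 1]` and speeds `1.0, 2.0, 0.5` would form two groups (the first two clients share a direction), and the faster client of each group would be selected.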
Pages: 131-143 (13 pages)
Related papers (50 total)
  • [41] Adaptive client selection with personalization for communication efficient Federated Learning
    de Souza, Allan M.
    Maciel, Filipe
    da Costa, Joahannes B. D.
    Bittencourt, Luiz F.
    Cerqueira, Eduardo
    Loureiro, Antonio A. F.
    Villas, Leandro A.
    AD HOC NETWORKS, 2024, 157
  • [42] RingSFL: An Adaptive Split Federated Learning Towards Taming Client Heterogeneity
    Shen, Jinglong
    Cheng, Nan
    Wang, Xiucheng
    Lyu, Feng
    Xu, Wenchao
    Liu, Zhi
    Aldubaikhy, Khalid
    Shen, Xuemin
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (05) : 5462 - 5478
  • [43] Federated Learning with Personalized Differential Privacy Combining Client Selection
    Xie, Yunting
    Zhang, Lan
    2022 8TH INTERNATIONAL CONFERENCE ON BIG DATA COMPUTING AND COMMUNICATIONS, BIGCOM, 2022, : 79 - 87
  • [44] Towards Instant Clustering Approach for Federated Learning Client Selection
    Arisdakessian, Sarhad
    Wahab, Omar Abdel
    Mourad, Azzam
    Otrok, Hadi
    2023 INTERNATIONAL CONFERENCE ON COMPUTING, NETWORKING AND COMMUNICATIONS, ICNC, 2023, : 409 - 413
  • [45] Review on Research Trends of Optimization for Client Selection in Federated Learning
    Kim, Jaemin
    Song, Chihyun
    Paek, Jeongyeup
    Kwon, Jung-Hyok
    Cho, Sungrae
    38TH INTERNATIONAL CONFERENCE ON INFORMATION NETWORKING, ICOIN 2024, 2024, : 287 - 289
  • [46] A Comprehensive Overview of IoT-Based Federated Learning: Focusing on Client Selection Methods
    Khajehali, Naghmeh
    Yan, Jun
    Chow, Yang-Wai
    Fahmideh, Mahdi
    SENSORS, 2023, 23 (16)
  • [47] FL-MAB: Client Selection and Monetization for Blockchain-Based Federated Learning
    Batool, Zahra
    Zhang, Kaiwen
    Toews, Matthew
    37TH ANNUAL ACM SYMPOSIUM ON APPLIED COMPUTING, 2022, : 299 - 307
  • [48] Pretraining Client Selection Algorithm Based on a Data Distribution Evaluation Model in Federated Learning
    Xu, Chang
    Liu, Hong
    Li, Kexin
    Feng, Wanglei
    Qi, Wei
    IEEE ACCESS, 2024, 12 : 63958 - 63966
  • [49] Reputation-Aware Federated Learning Client Selection Based on Stochastic Integer Programming
    Tan, Xavier
    Ng, Wei
    Lim, Wei
    Xiong, Zehui
    Niyato, Dusit
    Yu, Han
    IEEE TRANSACTIONS ON BIG DATA, 2024, 10 (06) : 953 - 964
  • [50] Bandit-based Communication-Efficient Client Selection Strategies for Federated Learning
    Cho, Yae Jee
    Gupta, Samarth
    Joshi, Gauri
    Yagan, Osman
    2020 54TH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS, AND COMPUTERS, 2020, : 1066 - 1069