Contribution prediction in federated learning via client behavior evaluation

Cited by: 1
Authors
Al-Saedi, Ahmed A. [1 ]
Boeva, Veselka [1 ]
Casalicchio, Emiliano [1 ,2 ]
Affiliations
[1] Blekinge Inst Technol, Dept Comp Sci, SE-37179 Karlskrona, Sweden
[2] Sapienza Univ Rome, Dept Comp Sci, I-00185 Rome, Italy
Keywords
Federated learning; Contribution evaluation; Clustering analysis; Eccentricity analysis; Behavior monitoring;
DOI
10.1016/j.future.2024.107639
Chinese Library Classification
TP301 [Theory, Methods];
Subject Classification
081202;
Abstract
Federated learning (FL), a decentralized machine learning framework that allows edge devices (i.e., clients) to train a global model while preserving data/client privacy, has become increasingly popular in recent years. In FL, a shared global model is built by aggregating the updated parameters in a distributed manner. To incentivize data owners to participate in FL, it is essential for service providers to fairly evaluate the contribution of each data owner to the shared model during the learning process. To the best of our knowledge, most existing solutions are resource-demanding and usually run as an additional evaluation procedure, which incurs a high computational cost for large data owners. In this paper, we present simple and effective FL solutions that show how the clients' behavior can be evaluated during the training process with respect to reliability, demonstrated on two existing FL models: Cluster Analysis-based Federated Learning (CA-FL) and Group-Personalized FL (GP-FL). In the former model, CA-FL, we assess the frequency with which each client is selected as a cluster representative and thereby involved in building the shared model; this frequency can be considered a measure of the reliability of the respective client's data. In the latter model, GP-FL, we calculate how many times each client changes the cluster it belongs to during FL training, which can be interpreted as a measure of unstable, i.e., less reliable, client behavior. We validate our FL approaches on three LEAF datasets and benchmark their performance against two baseline contribution evaluation approaches. The experimental results demonstrate that by applying the two FL models we are able to obtain robust evaluations of clients' behavior during the training process. These evaluations can be used for further studying, comparing, understanding, and eventually predicting clients' contributions to the shared global model.
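The two reliability signals described in the abstract are simple counting statistics over training rounds. A minimal sketch of how they could be computed is shown below; the function names, data layout, and toy values are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of the two per-client reliability signals from the
# abstract: representative-selection frequency (CA-FL) and cluster-change
# count (GP-FL). Data structures here are assumptions for illustration.
from collections import Counter


def representative_frequency(rounds):
    """CA-FL signal: fraction of rounds in which each client was chosen
    as a cluster representative. `rounds` is a list of per-round lists
    of representative client IDs."""
    counts = Counter(cid for reps in rounds for cid in reps)
    total = len(rounds)
    return {cid: n / total for cid, n in counts.items()}


def cluster_change_count(assignments):
    """GP-FL signal: number of times each client switches clusters across
    consecutive rounds. `assignments` is a list of per-round dicts mapping
    client ID -> cluster ID."""
    changes = Counter()
    for prev, curr in zip(assignments, assignments[1:]):
        for cid, cluster in curr.items():
            if cid in prev and prev[cid] != cluster:
                changes[cid] += 1
    return dict(changes)


# Toy example: 3 rounds, clients A-C.
reps = [["A"], ["A"], ["B"]]
assign = [{"A": 0, "B": 1, "C": 1},
          {"A": 0, "B": 0, "C": 1},
          {"A": 0, "B": 1, "C": 1}]
print(representative_frequency(reps))  # A selected in 2 of 3 rounds
print(cluster_change_count(assign))    # B switched clusters twice
```

A high representative frequency suggests reliable client data, while a high cluster-change count flags unstable behavior, matching the interpretations the abstract gives for the two models.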
Pages: 15
Related papers
50 records in total
  • [21] FairFed: Improving Fairness and Efficiency of Contribution Evaluation in Federated Learning via Cooperative Shapley Value
    Liu, Yiqi
    Chang, Shan
    Liu, Ye
    Li, Bo
    Wang, Cong
    IEEE INFOCOM 2024-IEEE CONFERENCE ON COMPUTER COMMUNICATIONS, 2024, : 621 - 630
  • [22] Federated Learning with Client Availability Budgets
    Bao, Yunkai
    Drew, Steve
    Wang, Xin
    Zhou, Jiayu
    Niu, Xiaoguang
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 1902 - 1907
  • [23] Reuse of Client Models in Federated Learning
    Cao, Bokai
    Wu, Weigang
    Zhan, Congcong
    Zhou, Jieying
    2022 IEEE INTERNATIONAL CONFERENCE ON SMART COMPUTING (SMARTCOMP 2022), 2022, : 356 - 361
  • [24] Client Selection in Hierarchical Federated Learning
    Trindade, Silvana
    da Fonseca, Nelson L. S.
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (17): : 28480 - 28495
  • [25] Client Selection for Federated Bayesian Learning
    Yang, Jiarong
    Liu, Yuan
    Kassab, Rahif
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2023, 41 (04) : 915 - 928
  • [26] AdaBest: Minimizing Client Drift in Federated Learning via Adaptive Bias Estimation
    Varno, Farshid
    Saghayi, Marzie
    Sevyeri, Laya Rafiee
    Gupta, Sharut
    Matwin, Stan
    Havaei, Mohammad
    COMPUTER VISION, ECCV 2022, PT XXIII, 2022, 13683 : 710 - 726
  • [27] Efficient Participant Contribution Evaluation for Horizontal and Vertical Federated Learning
    Wang, Junhao
    Zhang, Lan
    Li, Anran
    You, Xuanke
    Cheng, Haoran
    2022 IEEE 38TH INTERNATIONAL CONFERENCE ON DATA ENGINEERING (ICDE 2022), 2022, : 911 - 923
  • [28] Adaptive client and communication optimizations in Federated Learning
    Wu, Jiagao
    Wang, Yu
    Shen, Zhangchi
    Liu, Linfeng
    INFORMATION SYSTEMS, 2023, 116
  • [29] Client Selection with Bandwidth Allocation in Federated Learning
    Kuang, Junqian
    Yang, Miao
    Zhu, Hongbin
    Qian, Hua
    2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2021
  • [30] Online Client Scheduling for Fast Federated Learning
    Xu, Bo
    Xia, Wenchao
    Zhang, Jun
    Quek, Tony Q. S.
    Zhu, Hongbo
    IEEE WIRELESS COMMUNICATIONS LETTERS, 2021, 10 (07) : 1434 - 1438