Contribution prediction in federated learning via client behavior evaluation

Cited by: 1
Authors
Al-Saedi, Ahmed A. [1]
Boeva, Veselka [1]
Casalicchio, Emiliano [1,2]
Affiliations
[1] Blekinge Inst Technol, Dept Comp Sci, SE-37179 Karlskrona, Sweden
[2] Sapienza Univ Rome, Dept Comp Sci, I-00185 Rome, Italy
Keywords
Federated learning; Contribution evaluation; Clustering analysis; Eccentricity analysis; Behavior monitoring
DOI
10.1016/j.future.2024.107639
Chinese Library Classification
TP301 [Theory, Methods]
Subject Classification Code
081202
Abstract
Federated learning (FL), a decentralized machine learning framework that allows edge devices (i.e., clients) to train a global model while preserving data and client privacy, has become increasingly popular. In FL, a shared global model is built by aggregating the clients' updated parameters in a distributed manner. To incentivize data owners to participate in FL, service providers must fairly evaluate each data owner's contribution to the shared model during the learning process. To the best of our knowledge, most existing solutions are resource-demanding and usually run as an additional evaluation procedure, which incurs a high computational cost for large data owners. In this paper, we present simple and effective FL solutions that show how clients' behavior can be evaluated during the training process with respect to reliability; this is demonstrated for two existing FL models, Cluster Analysis-based Federated Learning (CA-FL) and Group-Personalized FL (GP-FL). In the former model, CA-FL, we assess how often each client is selected as a cluster representative and thereby involved in building the shared model, which can be considered a measure of the reliability of that client's data. In the latter model, GP-FL, we count how many times each client changes the cluster it belongs to during FL training, which can be interpreted as a measure of unstable, and hence less reliable, client behavior. We validate our FL approaches on three LEAF datasets and benchmark their performance against two baseline contribution evaluation approaches. The experimental results demonstrate that by applying the two FL models we obtain robust evaluations of clients' behavior during the training process. These evaluations can be used for further studying, comparing, understanding, and eventually predicting clients' contributions to the shared global model.
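As a rough illustration of the two behavior measures described in the abstract, the sketch below counts (i) how often each client is selected as a cluster representative across rounds (the CA-FL measure) and (ii) how often each client changes its cluster between consecutive rounds (the GP-FL instability measure). This is a minimal sketch, not the authors' implementation: the function names, the per-round log formats, and the client identifiers are hypothetical and introduced only for illustration.

```python
# Hypothetical sketch of the two per-client behavior measures.
# Assumed inputs (not from the paper): for each FL round, the set of clients
# chosen as cluster representatives (CA-FL) and a mapping client -> cluster
# id (GP-FL).
from collections import Counter
from typing import Dict, Hashable, Iterable, List, Set


def representative_frequency(rounds: Iterable[Set[Hashable]]) -> Counter:
    """CA-FL measure: how often each client was selected as a cluster
    representative, i.e., directly contributed to the shared model."""
    counts: Counter = Counter()
    for representatives in rounds:
        counts.update(representatives)
    return counts


def cluster_switch_count(assignments: List[Dict[Hashable, int]]) -> Counter:
    """GP-FL measure: how many times each client changed the cluster it
    belongs to across consecutive training rounds (instability indicator)."""
    switches: Counter = Counter()
    for prev, curr in zip(assignments, assignments[1:]):
        for client, cluster in curr.items():
            if client in prev and prev[client] != cluster:
                switches[client] += 1
    return switches


# Toy usage with three rounds and three clients (illustrative values only).
reps_per_round = [{"c1"}, {"c1", "c3"}, {"c2"}]
clusters_per_round = [
    {"c1": 0, "c2": 1, "c3": 1},
    {"c1": 0, "c2": 0, "c3": 1},
    {"c1": 0, "c2": 1, "c3": 1},
]
print(representative_frequency(reps_per_round))  # c1 selected twice; c2, c3 once each
print(cluster_switch_count(clusters_per_round))  # c2 switched clusters twice; c1, c3 never
```

Higher representative counts would then be read as more reliable client data, and higher switch counts as less stable, less reliable behavior, in line with the interpretation given in the abstract.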
Pages: 15