Adaptive Upgrade of Client Resources for Improving the Quality of Federated Learning Model

Cited: 16
Authors
AbdulRahman, Sawsan [1 ]
Ould-Slimane, Hakima [2 ]
Chowdhury, Rasel [1 ]
Mourad, Azzam [3 ,4 ]
Talhi, Chamseddine [1 ]
Guizani, Mohsen [5 ]
Affiliations
[1] Ecole Technol Sup, Dept Software Engn & IT, Montreal, PQ H3C 1K3, Canada
[2] Univ Quebec Trois Rivieres, Dept Math & Comp Sci, Trois Rivieres, PQ G8Z 4M3, Canada
[3] Lebanese Amer Univ, Cyber Secur Syst & Appl AI Res Ctr, Dept CSM, Beirut, Lebanon
[4] New York Univ Abu Dhabi, Div Sci, Abu Dhabi, U Arab Emirates
[5] Mohamed Bin Zayed Univ Artificial Intelligence, Dept Machine Learning, Abu Dhabi, U Arab Emirates
Keywords
Data models; Servers; Internet of Things; Adaptation models; Performance evaluation; Computational modeling; Training; Client selection; federated learning (FL); Internet of Things (IoT); Kubernetes; model significance; resource allocation; COMMUNICATION;
DOI
10.1109/JIOT.2022.3218755
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Conventional systems are usually constrained to store data in a centralized location. This restriction has either precluded sensitive data from being shared or put its privacy on the line. Alternatively, federated learning (FL) has emerged as a promising privacy-preserving paradigm for exchanging model parameters instead of the private data of Internet of Things (IoT) devices known as clients. FL trains a global model by communicating local models generated by selected clients over many communication rounds until high learning performance is reached. In these settings, FL performance highly depends on selecting the best available clients, a process strongly tied to the quality of their models and their training data. Such selection-based schemes have not yet been explored, particularly for participating clients that have high-quality data yet limited resources. To address these challenges, we propose in this article FedAUR, a novel approach for the adaptive upgrade of client resources in FL. We first introduce a method to measure how a locally generated model affects and improves the global model if selected for aggregation, without revealing raw data. Next, based on the significance of each client's parameters and the resources of their devices, we design a selection scheme that manages and distributes available server resources among the appropriate subset of clients. This client selection and resource allocation problem is thus formulated as an optimization problem, where the purpose is to discover and train in each round the maximum number of samples with the highest quality in order to reach the desired performance. Moreover, we present a Kubernetes-based prototype that we implemented to evaluate the performance of the proposed approach.
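To make the abstract's pipeline concrete, the following is a minimal sketch, not the authors' FedAUR implementation: it assumes a hypothetical L2-distance proxy for a client's model significance, a greedy significance-per-cost selection under a server resource budget, and standard FedAvg-style weighted aggregation. All function names and the significance metric are illustrative assumptions.

```python
def significance(global_model, local_model):
    """Illustrative proxy for a local model's significance: L2 distance
    between local and global parameter vectors (plain Python lists)."""
    return sum((g - l) ** 2 for g, l in zip(global_model, local_model)) ** 0.5


def select_clients(clients, budget):
    """Greedy knapsack-style selection: rank clients by significance per
    unit of resource cost, then admit them until the server's resource
    budget is exhausted. Each client is a dict with 'id', 'significance',
    and 'cost' keys (hypothetical schema)."""
    ranked = sorted(clients, key=lambda c: c["significance"] / c["cost"],
                    reverse=True)
    chosen, used = [], 0.0
    for c in ranked:
        if used + c["cost"] <= budget:
            chosen.append(c["id"])
            used += c["cost"]
    return chosen


def fedavg(updates, weights):
    """FedAvg-style aggregation: weighted average of the selected clients'
    parameter vectors, weighted e.g. by their number of training samples."""
    total = sum(weights)
    dim = len(updates[0])
    return [sum(w * u[i] for u, w in zip(updates, weights)) / total
            for i in range(dim)]
```

A round would then compute each client's significance against the current global model, call `select_clients` with the server's budget, and aggregate the chosen updates with `fedavg`; the actual optimization in the paper is richer than this greedy heuristic.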
Pages: 4677-4687
Page count: 11