Communication-Efficient and Privacy-Preserving Federated Learning via Joint Knowledge Distillation and Differential Privacy in Bandwidth-Constrained Networks

Cited by: 2
Authors
Gad, Gad [1 ]
Gad, Eyad [1 ]
Fadlullah, Zubair Md [1 ]
Fouda, Mostafa M. [2 ,3 ]
Kato, Nei [4 ]
Affiliations
[1] Western Univ, Dept Comp Sci, London, ON N6G 2V4, Canada
[2] Idaho State Univ, Dept Elect & Comp Engn, Pocatello, ID 83209 USA
[3] Ctr Adv Energy Studies CAES, Idaho Falls, ID 83401 USA
[4] Tohoku Univ, Grad Sch Informat Sci, Sendai 9808577, Japan
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords
Servers; Data models; Training; Federated learning; Distributed databases; Deep learning; Accuracy; B5G networks; deep learning; differential privacy; federated learning; gradient compression; heterogeneous federated learning; knowledge distillation; INTERNET;
DOI
10.1109/TVT.2024.3423718
CLC Classification
TM (Electrical Engineering); TN (Electronics & Communication Technology);
Discipline Codes
0808; 0809;
Abstract
The development of high-quality deep learning models demands the transfer of user data from the edge devices where it originates to centralized servers. This central training approach has scalability limitations and poses privacy risks to private data. Federated Learning (FL) is a distributed training framework that enables physical smart-system devices to collaboratively learn a task without sharing private training data with a central server. However, FL introduces new challenges to Beyond 5G (B5G) networks, such as communication overhead, system heterogeneity, and privacy concerns, as the exchange of model updates may still lead to data leakage. This paper explores the communication overhead and privacy risks facing the implementation of FL and presents an algorithm that combines Knowledge Distillation (KD) and Differential Privacy (DP) techniques to address these challenges. We compare the operational flow and network model of model-based and model-agnostic (KD-based) FL algorithms, which allow per-client model architectures to be customized to heterogeneous and constrained system resources. Our experiments show that KD-based FL algorithms can exceed local-training accuracy and achieve accuracy comparable to central training. Additionally, we show that applying DP to KD-based FL significantly degrades its utility, causing up to 70% accuracy loss for a privacy budget ε ≤ 10.
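The abstract's DP step is not detailed in this record; the following is a minimal, generic sketch of the Gaussian mechanism commonly used to make client updates differentially private in FL (clip the update's L2 norm to bound sensitivity, then add calibrated Gaussian noise). The function name and parameter choices (`clip_norm`, `noise_multiplier`) are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Clip a client's model update and add Gaussian noise.

    Clipping bounds the L2 sensitivity of the aggregate to clip_norm;
    the noise standard deviation noise_multiplier * clip_norm is a common
    DP-SGD-style parameterization (larger multiplier -> smaller epsilon,
    but more utility loss, consistent with the trade-off the abstract reports).
    """
    rng = np.random.default_rng(rng)
    norm = np.linalg.norm(update)
    clipped = update / max(1.0, norm / clip_norm)  # scale down only if too large
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return clipped + noise

# Example: clip and noise a synthetic 1000-dimensional update
update = np.full(1000, 0.5)                 # L2 norm ~ 15.8, above clip_norm
private = privatize_update(update, rng=0)   # clipped to norm 1.0, then noised
```

The privacy budget ε is then accounted for across rounds (e.g., via moments accounting) rather than computed inside this per-round step.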
Pages: 17586-17601 (16 pages)