Communication-Efficient and Privacy-Preserving Federated Learning via Joint Knowledge Distillation and Differential Privacy in Bandwidth-Constrained Networks

Cited by: 2
Authors
Gad, Gad [1 ]
Gad, Eyad [1 ]
Fadlullah, Zubair Md [1 ]
Fouda, Mostafa M. [2 ,3 ]
Kato, Nei [4 ]
Affiliations
[1] Western Univ, Dept Comp Sci, London, ON N6G 2V4, Canada
[2] Idaho State Univ, Dept Elect & Comp Engn, Pocatello, ID 83209 USA
[3] Ctr Adv Energy Studies CAES, Idaho Falls, ID 83401 USA
[4] Tohoku Univ, Grad Sch Informat Sci, Sendai 9808577, Japan
Funding
Natural Sciences and Engineering Research Council of Canada
Keywords
Servers; Data models; Training; Federated learning; Distributed databases; Deep learning; Accuracy; B5G networks; deep learning; differential privacy; federated learning; gradient compression; heterogeneous federated learning; knowledge distillation; Internet
DOI
10.1109/TVT.2024.3423718
CLC classification
TM (Electrical technology); TN (Electronic and communication technology)
Discipline codes
0808; 0809
Abstract
The development of high-quality deep learning models demands transferring user data from the edge devices where it originates to centralized servers. This centralized training approach has scalability limitations and poses privacy risks. Federated Learning (FL) is a distributed training framework that enables physical smart-system devices to collaboratively learn a task without sharing private training data with a central server. However, FL introduces new challenges in Beyond-5G (B5G) networks, such as communication overhead, system heterogeneity, and privacy concerns, since the exchange of model updates may still leak data. This paper examines the communication overhead and privacy risks facing FL deployments and presents an algorithm that combines Knowledge Distillation (KD) and Differential Privacy (DP) to address these challenges. We compare the operational flow and network model of model-based and model-agnostic (KD-based) FL algorithms, the latter of which allow per-client model architectures to accommodate heterogeneous and constrained system resources. Our experiments show that KD-based FL algorithms exceed local accuracy and achieve accuracy comparable to centralized training. We also show that applying DP to KD-based FL significantly degrades its utility, causing up to 70% accuracy loss for a privacy budget ε ≤ 10.
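The abstract describes applying DP to KD-based FL, where clients exchange soft labels (logits) rather than model weights and the server perturbs the aggregate before broadcasting it back. The paper does not specify its mechanism here, so the following is only a minimal sketch of one common approach: clipping each client's logit contribution and adding Gaussian noise to the sum. The function name, clipping scheme, and parameter values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def dp_aggregate_logits(client_logits, clip_norm=1.0, sigma=4.0, seed=None):
    """Aggregate per-client soft labels with a Gaussian mechanism (sketch).

    client_logits: list of (num_samples, num_classes) arrays, one per client.
    Each client's array is clipped to Frobenius norm <= clip_norm so that one
    client's contribution to the sum is bounded (sensitivity = clip_norm);
    noise with std sigma * clip_norm is then added to the sum before averaging.
    Larger sigma means a smaller privacy budget ε but noisier distillation
    targets, which is the utility loss the paper quantifies.
    """
    rng = np.random.default_rng(seed)
    clipped = []
    for logits in client_logits:
        norm = np.linalg.norm(logits)
        scale = min(1.0, clip_norm / (norm + 1e-12))  # shrink only if too large
        clipped.append(logits * scale)
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, sigma * clip_norm, size=total.shape)
    return (total + noise) / len(client_logits)
```

In a KD-based round, each client would distill its local model into these noisy aggregate logits on a shared public dataset, so the DP noise directly corrupts the distillation targets rather than the gradients.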
Pages: 17586-17601
Page count: 16
Related papers
50 items in total
  • [41] Privacy-Preserving Federated Learning via Disentanglement
    Zhou, Wenjie
    Li, Piji
    Han, Zhaoyang
    Lu, Xiaozhen
    Li, Juan
    Ren, Zhaochun
    Liu, Zhe
    PROCEEDINGS OF THE 32ND ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2023, 2023, : 3606 - 3615
  • [42] Privacy-preserving Federated Learning for Industrial Defect Detection Systems via Differential Privacy and Image Obfuscation
    Lin, Chia-Yu
    Yeh, Yu-Chen
    Lu, Makena
    2024 IEEE CONFERENCE ON ARTIFICIAL INTELLIGENCE, CAI 2024, 2024, : 1136 - 1141
  • [43] FL2DP: Privacy-Preserving Federated Learning Via Differential Privacy for Artificial IoT
    Gu, Chen
    Cui, Xuande
    Zhu, Xiaoling
    Hu, Donghui
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2024, 20 (04) : 5100 - 5111
  • [44] Energy-Efficient and Privacy-Preserving Blockchain Based Federated Learning for Smart Healthcare System
    Singh, Moirangthem Biken
    Singh, Himanshu
    Pratap, Ajay
    IEEE TRANSACTIONS ON SERVICES COMPUTING, 2024, 17 (05) : 2392 - 2403
  • [45] SAEV: Secure Aggregation and Efficient Verification for Privacy-Preserving Federated Learning
    Wang, Junkai
    Wang, Rong
    Xiong, Ling
    Xiong, Neal
    Liu, Zhicai
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (24): : 39681 - 39696
  • [46] An Efficient and Dynamic Privacy-Preserving Federated Learning System for Edge Computing
    Tang, Xinyu
    Guo, Cheng
    Choo, Kim-Kwang Raymond
    Liu, Yining
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2024, 19 : 207 - 220
  • [47] Efficient Privacy-Preserving Federated Learning for Resource-Constrained Edge Devices
    Wu, Jindi
    Xia, Qi
    Li, Qun
    2021 17TH INTERNATIONAL CONFERENCE ON MOBILITY, SENSING AND NETWORKING (MSN 2021), 2021, : 191 - 198
  • [48] A Personalized Federated Learning Method Based on Knowledge Distillation and Differential Privacy
    Jiang, Yingrui
    Zhao, Xuejian
    Li, Hao
    Xue, Yu
    ELECTRONICS, 2024, 13 (17)
  • [49] Efficient and privacy-preserving group signature for federated learning
    Kanchan, Sneha
    Jang, Jae Won
    Yoon, Jun Yong
    Choi, Bong Jun
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2023, 147 : 93 - 106
  • [50] Efficient and Privacy-Preserving Federated Learning with Irregular Users
    Xu, Jieyu
    Li, Hongwei
    Zeng, Jia
    Hao, Meng
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 534 - 539