Communication-Efficient and Privacy-Preserving Federated Learning via Joint Knowledge Distillation and Differential Privacy in Bandwidth-Constrained Networks

Cited by: 2
Authors
Gad, Gad [1 ]
Gad, Eyad [1 ]
Fadlullah, Zubair Md [1 ]
Fouda, Mostafa M. [2 ,3 ]
Kato, Nei [4 ]
Affiliations
[1] Western Univ, Dept Comp Sci, London, ON N6G 2V4, Canada
[2] Idaho State Univ, Dept Elect & Comp Engn, Pocatello, ID 83209 USA
[3] Ctr Adv Energy Studies CAES, Idaho Falls, ID 83401 USA
[4] Tohoku Univ, Grad Sch Informat Sci, Sendai 9808577, Japan
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords
Servers; Data models; Training; Federated learning; Distributed databases; Deep learning; Accuracy; B5G networks; deep learning; differential privacy; federated learning; gradient compression; heterogeneous federated learning; knowledge distillation; INTERNET;
DOI
10.1109/TVT.2024.3423718
Chinese Library Classification (CLC)
TM (Electrical Engineering); TN (Electronics and Communication Technology);
Discipline codes
0808; 0809;
Abstract
The development of high-quality deep learning models demands the transfer of user data from the edge devices where it originates to centralized servers. This central training approach has scalability limitations and poses privacy risks to user data. Federated Learning (FL) is a distributed training framework that empowers physical smart system devices to collaboratively learn a task without sharing private training data with a central server. However, FL introduces new challenges to Beyond 5G (B5G) networks, such as communication overhead, system heterogeneity, and privacy concerns, since the exchange of model updates may still leak data. This paper explores the communication overhead and privacy risks facing the implementation of FL and presents an algorithm that combines Knowledge Distillation (KD) and Differential Privacy (DP) techniques to address these challenges. We compare the operational flow and network model of model-based and model-agnostic (KD-based) FL algorithms, where the latter allow customizing the model architecture per client to accommodate heterogeneous and constrained system resources. Our experiments show that KD-based FL algorithms exceed local accuracy and achieve accuracy comparable to central training. Additionally, we show that applying DP to KD-based FL significantly degrades its utility, leading to up to 70% accuracy loss for a privacy budget ε ≤ 10.
Pages: 17586-17601
Page count: 16
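The abstract does not give implementation details, so the following is only a minimal, hypothetical sketch of the general idea behind model-agnostic (KD-based) FL with differential privacy: each client trains a model of its own architecture, computes logits on a shared public distillation set, clips and noises those logits in a Gaussian-mechanism style before uploading, and the server averages the noisy logits into soft targets for distillation. All names and parameters here (`clip_and_noise`, `noise_std`, `temperature`, etc.) are illustrative assumptions, not the paper's API, and no privacy accounting (relating `noise_std` to an actual ε) is performed.

```python
import numpy as np

# Hypothetical sketch of one KD-based FL round with DP-style noised logits.
# This is NOT the paper's exact algorithm; it only illustrates the data flow.

def clip_and_noise(logits, clip_norm=1.0, noise_std=0.5, rng=None):
    """Client side: clip each per-example logit vector to a bounded L2 norm,
    then add Gaussian noise. The achieved epsilon would depend on clip_norm,
    noise_std, and the number of rounds; that accounting is omitted here."""
    rng = rng or np.random.default_rng()
    norms = np.linalg.norm(logits, axis=1, keepdims=True)
    clipped = logits * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    return clipped + rng.normal(0.0, noise_std, size=clipped.shape)

def aggregate_soft_labels(client_logits, temperature=3.0):
    """Server side: average the noisy client logits on the shared public set
    and soften them with a temperature to obtain distillation targets."""
    mean_logits = np.mean(np.stack(client_logits, axis=0), axis=0)
    z = mean_logits / temperature
    z -= z.max(axis=1, keepdims=True)  # numerical stability before softmax
    exp_z = np.exp(z)
    return exp_z / exp_z.sum(axis=1, keepdims=True)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    num_public, num_classes = 8, 10
    # Each client would compute logits with its *own* architecture on the
    # shared public set; random placeholders stand in for those logits here.
    clients = [rng.normal(size=(num_public, num_classes)) for _ in range(3)]
    noisy = [clip_and_noise(c, rng=rng) for c in clients]
    soft_targets = aggregate_soft_labels(noisy)
    print(soft_targets.shape)  # (8, 10): one soft label per public example
```

In this view, only low-dimensional logits on the public set (rather than full model weights) cross the network, which is consistent with the communication-efficiency and heterogeneity claims in the abstract; the reported utility loss under ε ≤ 10 would correspond to the noise required on those shared logits.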