Communication-Efficient Personalized Federated Learning With Privacy-Preserving

Cited by: 3
Authors
Wang, Qian [1]
Chen, Siguang [1]
Wu, Meng [1]
Affiliations
[1] Nanjing Univ Posts & Telecommun, Sch Internet Things, Nanjing 210003, Peoples R China
Source
IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT | 2024 / Vol. 21 / No. 2
Funding
National Natural Science Foundation of China
Keywords
Adaptation models; Computational modeling; Costs; Training; Privacy; Servers; Data models; Federated learning; knowledge distillation; feature fusion; privacy-preserving; gradient compression
DOI
10.1109/TNSM.2023.3323129
CLC number
TP [Automation Technology, Computer Technology]
Discipline classification code
0812
Abstract
Federated learning (FL) has gained strong momentum and is widely applied to train models in distributed scenarios. However, high communication cost, poor performance on heterogeneous datasets and models, and emerging privacy leakage remain major problems in FL. In this paper, we propose a communication-efficient personalized FL scheme with privacy preservation. First, we develop personalized FL with feature fusion-based mutual learning, which achieves communication-efficient and personalized learning by training a shared model, a private model, and a fusion model reciprocally on each client. Specifically, only the shared model is exchanged with the global model to reduce communication cost, the private model can be personalized, and the fusion model adaptively fuses local and global knowledge at different training stages. Second, to further reduce communication cost and strengthen gradient privacy, we design a privacy-preserving method based on gradient compression. In this method, we construct a chaotic encrypted cyclic measurement matrix that achieves strong privacy protection and lightweight compression. Moreover, we present a sparsity-based adaptive iterative hard thresholding algorithm to improve flexibility and reconstruction performance. Finally, we conduct extensive experiments on different datasets and models; the results show that our scheme achieves more competitive results than other benchmarks in terms of model performance and privacy.
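
To make the three-model training loop concrete, below is a minimal PyTorch sketch of how a shared model, a private model, and a fusion model could be trained reciprocally on a client. It is an illustration under stated assumptions, not the authors' implementation: the class name FusionClient, the sigmoid-gated fusion weight, and the mutual-distillation loss are all hypothetical choices consistent with the abstract's description.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FusionClient(nn.Module):
    # Hypothetical client holding three models: a shared encoder that is
    # synchronized with the server, a private encoder that never leaves
    # the device, and a learned gate that fuses their features.
    def __init__(self, in_dim=784, feat_dim=64, n_classes=10):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.private = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.alpha = nn.Parameter(torch.zeros(1))    # adaptive fusion weight
        self.head_s = nn.Linear(feat_dim, n_classes)
        self.head_p = nn.Linear(feat_dim, n_classes)
        self.head_f = nn.Linear(feat_dim, n_classes)

    def forward(self, x):
        zs, zp = self.shared(x), self.private(x)
        a = torch.sigmoid(self.alpha)                # gate in (0, 1)
        zf = a * zs + (1.0 - a) * zp                 # fused features
        return self.head_s(zs), self.head_p(zp), self.head_f(zf)

def mutual_loss(out_s, out_p, out_f, target, t=2.0):
    # Cross-entropy on each branch plus temperature-scaled KL terms so
    # the three branches teach one another (generic mutual distillation).
    ce = sum(F.cross_entropy(o, target) for o in (out_s, out_p, out_f))
    def kd(student, teacher):
        return F.kl_div(F.log_softmax(student / t, dim=-1),
                        F.softmax(teacher.detach() / t, dim=-1),
                        reduction="batchmean") * t * t
    return (ce + kd(out_s, out_f) + kd(out_p, out_f)
               + kd(out_f, out_s) + kd(out_f, out_p))

After local training, only the parameters of self.shared (and its head) would be uploaded for aggregation, which is where the saving over communicating the full client state comes from.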
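
The gradient-compression side can be sketched in the same spirit with NumPy. The logistic-map seed, the matrix sizes, and the fixed sparsity level k below are assumptions for illustration; in particular, plain iterative hard thresholding stands in for the paper's sparsity-based adaptive variant.

import numpy as np

def logistic_map_sequence(length, x0=0.37, r=3.99):
    # Chaotic sequence from the logistic map; the seed x0 acts as a
    # shared secret between client and server (illustrative).
    seq, x = np.empty(length), x0
    for i in range(length):
        x = r * x * (1.0 - x)
        seq[i] = x
    return seq

def cyclic_measurement_matrix(m, n, x0=0.37):
    # m x n matrix whose rows are cyclic shifts of one chaotic row, so
    # only the seed (not the full matrix) has to be agreed upon.
    row = 2.0 * logistic_map_sequence(n, x0) - 1.0   # map to (-1, 1)
    return np.stack([np.roll(row, i) for i in range(m)])

def iht_reconstruct(y, Phi, k, steps=300):
    # Plain iterative hard thresholding: gradient step, then keep the
    # k largest-magnitude entries; step size from the spectral norm.
    mu = 1.0 / np.linalg.norm(Phi, 2) ** 2
    x = np.zeros(Phi.shape[1])
    for _ in range(steps):
        x = x + mu * Phi.T @ (y - Phi @ x)
        x[np.argpartition(np.abs(x), -k)[:-k]] = 0.0
    return x

# Toy round trip on a sparse "gradient" vector.
rng = np.random.default_rng(0)
n, m, k = 256, 96, 10
g = np.zeros(n)
g[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
Phi = cyclic_measurement_matrix(m, n)
y = Phi @ g                                # compressed, obfuscated upload
g_hat = iht_reconstruct(y, Phi, k)
print(np.linalg.norm(g - g_hat) / np.linalg.norm(g))

Here the seed x0 plays the role of a key: a server that knows it can rebuild Phi and reconstruct the gradient, while an eavesdropper sees only the m-dimensional sketch y, giving both compression (m < n) and a measure of privacy.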
Pages: 2374-2388
Number of pages: 15