FLCP: federated learning framework with communication-efficient and privacy-preserving

Cited by: 0
Authors
Yang, Wei [1 ]
Yang, Yuan [1 ]
Xi, Yingjie [2 ]
Zhang, Hailong [2 ]
Xiang, Wei [3 ,4 ]
Affiliations
[1] Xian Univ Technol, Sch Automat & Informat Engn, Xian 710048, Peoples R China
[2] Xian Univ Technol, Sch Int Engn, Xian 710048, Peoples R China
[3] La Trobe Univ, Sch Comp Engn & Math Sci, Melbourne 3086, Australia
[4] James Cook Univ, Coll Sci & Engn, Cairns, Qld 4870, Australia
Funding
National Natural Science Foundation of China;
Keywords
Artificial intelligence; Federated learning; Privacy protection; Communication efficiency; DIFFERENTIAL PRIVACY; ENCRYPTION;
DOI
10.1007/s10489-024-05521-y
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Within the federated learning (FL) framework, clients collaboratively train a model in coordination with a central server while the training data remain local to each client. The FL framework thus mitigates the privacy disclosure and costs associated with conventional centralized machine learning. Nevertheless, recent surveys indicate that FL still faces problems in terms of communication efficiency and privacy risks. In this paper, to address these problems, we develop a communication-efficient and privacy-preserving FL framework (FLCP). To realize the FLCP, we design a novel communication-efficient compression algorithm, namely adaptive weight compression FedAvg (AWC-FedAvg). Based on the non-independent and identically distributed (non-IID) and unbalanced data distributions in FL, a specific compression rate is assigned to each client, and homomorphic encryption (HE) and differential privacy (DP) are integrated to provide demonstrable privacy protection while maintaining model utility. Therefore, the proposed FLCP smoothly balances communication efficiency and privacy risks, and we prove its security against "honest-but-curious" servers and extreme collusion under the defined threat model. We evaluate the scheme by comparing it with state-of-the-art methods on the MNIST and CIFAR-10 datasets. The results show that the FLCP outperforms the baseline methods in terms of training efficiency and model accuracy.
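The abstract names three ingredients: per-client adaptive weight compression (AWC-FedAvg), differential privacy, and homomorphic encryption. The Python sketch below illustrates only the first two in a simplified form and is not the authors' algorithm: the compression-rate heuristic (scaling a base rate by each client's data share), the top-k sparsification, the Gaussian noise parameters, and all function names are assumptions for illustration, and HE is omitted entirely.

```python
# Minimal sketch of the ideas named in the abstract, NOT the paper's AWC-FedAvg:
# per-client adaptive top-k compression of model updates plus Gaussian DP noise,
# followed by data-size-weighted (FedAvg-style) aggregation. All heuristics and
# names here are assumptions; homomorphic encryption is not modeled.
import numpy as np

def per_client_rate(n_local, n_total, num_clients, base_rate=0.1):
    """Assumed heuristic: clients holding more (unbalanced) data keep a larger
    fraction of their update; the kept fraction is clipped to [0.01, 1.0]."""
    share = n_local / n_total
    return float(np.clip(base_rate * share * num_clients, 0.01, 1.0))

def compress_update(update, rate):
    """Top-k magnitude sparsification: zero out all but the largest entries."""
    flat = update.ravel()
    k = max(1, int(rate * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    return sparse.reshape(update.shape)

def add_dp_noise(update, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Clip the update to a fixed L2 norm and add Gaussian noise (DP-style)."""
    if rng is None:
        rng = np.random.default_rng()
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    return clipped + rng.normal(0.0, noise_multiplier * clip_norm, update.shape)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sizes = [600, 200, 1200]                      # unbalanced local dataset sizes
    updates = [rng.normal(size=(1000,)) for _ in sizes]
    total = sum(sizes)
    protected = []
    for n, u in zip(sizes, updates):
        rate = per_client_rate(n, total, len(sizes))
        protected.append(add_dp_noise(compress_update(u, rate)))
    # Server-side FedAvg: weight each client's update by its share of the data.
    global_update = sum(n * u for n, u in zip(sizes, protected)) / total
    print(global_update[:5])
```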
Pages: 6816-6835
Page count: 20
Related papers
50 records in total
  • [1] Communication-Efficient Personalized Federated Learning With Privacy-Preserving
    Wang, Qian
    Chen, Siguang
    Wu, Meng
    IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT, 2024, 21 (02): 2374-2388
  • [2] Communication-Efficient and Privacy-Preserving Aggregation in Federated Learning With Adaptability
    Sun, Xuehua
    Yuan, Zengsen
    Kong, Xianguang
    Xue, Liang
    He, Lang
    Lin, Ying
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (15): 26430-26443
  • [3] Communication-Efficient and Privacy-Preserving Verifiable Aggregation for Federated Learning
    Peng, Kaixin
    Shen, Xiaoying
    Gao, Le
    Wang, Baocang
    Lu, Yichao
    ENTROPY, 2023, 25 (08)
  • [4] Privacy-preserving and communication-efficient federated learning in Internet of Things
    Fang, Chen
    Guo, Yuanbo
    Hu, Yongjin
    Ma, Bowen
    Feng, Li
    Yin, Anqi
    COMPUTERS & SECURITY, 2021, 103 (103)
  • [5] Communication-Efficient and Privacy-Preserving Feature-based Federated Transfer Learning
    Wang, Feng
    Gursoy, M. Cenk
    Velipasalar, Senem
    2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022: 3875-3880
  • [6] Communication-efficient and privacy-preserving large-scale federated learning counteracting heterogeneity
    Zhou, Xingcai
    Yang, Guang
    INFORMATION SCIENCES, 2024, 661
  • [7] Privacy-preserving and communication-efficient stochastic alternating direction method of multipliers for federated learning
    Zhang, Yi
    Lu, Yunfan
    Liu, Fengxia
    Li, Cheng
    Gong, Zixian
    Hu, Zhe
    Xu, Qun
    INFORMATION SCIENCES, 2025, 691
  • [8] FedNew: A Communication-Efficient and Privacy-Preserving Newton-Type Method for Federated Learning
    Elgabli, Anis
    Issaid, Chaouki B.
    Bedi, Amrit S.
    Rajawat, Ketan
    Bennis, Mehdi
    Aggarwal, Vaneet
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [9] Privacy-Preserving Communication-Efficient Federated Multi-Armed Bandits
    Li, Tan
    Song, Linqi
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2022, 40 (03): 773-787
  • [10] Communication-Efficient Privacy-Preserving Clustering
    Jagannathan, Geetha
    Pillaipakkamnatt, Krishnan
    Wright, Rebecca N.
    Umano, Daryl
    TRANSACTIONS ON DATA PRIVACY, 2010, 3 (01): 2-26