FedDP-SA: Boosting Differentially Private Federated Learning via Local Data Set Splitting

Cited by: 0
Authors
Liu, Xuezheng [1 ]
Zhou, Yipeng [2 ]
Wu, Di [1 ]
Hu, Miao [1 ]
Hui Wang, Jessie [3 ,4 ]
Guizani, Mohsen [5 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangdong Key Lab Big Data Anal & Proc, Guangzhou 510006, Peoples R China
[2] Macquarie Univ, Fac Sci & Engn, Dept Comp, Sydney, NSW 2109, Australia
[3] Tsinghua Univ, Inst Network Sci & Cyberspace, Beijing 100084, Peoples R China
[4] Tsinghua Univ, Beijing Natl Res Ctr Informat Sci & Technol, Beijing 100084, Peoples R China
[5] Mohamed bin Zayed Univ Artificial Intelligence, Machine Learning Dept, Abu Dhabi, U Arab Emirates
Source
IEEE INTERNET OF THINGS JOURNAL, 2024, Vol. 11, No. 19
Funding
National Natural Science Foundation of China
Keywords
Noise; Privacy; Computational modeling; Data models; Differential privacy; Accuracy; Internet of Things; Data splitting; federated learning (FL); Gaussian mechanism; sensitivity and convergence rate
DOI
10.1109/JIOT.2024.3421991
CLC number
TP [automation technology; computer technology]
Discipline code
0812
Abstract
Federated learning (FL) has emerged as an attractive collaborative machine learning framework that enables model training across decentralized devices by exposing only model parameters. However, malicious attackers can still hijack the communicated parameters to expose clients' raw samples, resulting in privacy leakage. To defend against such attacks, differentially private FL (DPFL) has been devised, which protects privacy by adding noise at negligible computation overhead. Nevertheless, low model utility and poor communication efficiency make DPFL hard to deploy in real environments. To overcome these deficiencies, we propose a novel DPFL algorithm called FedDP-SA (federated learning with differential privacy by splitting local data sets and averaging parameters). Specifically, FedDP-SA splits a local data set into multiple subsets for parameter updating. The parameters averaged over all subsets, plus differential privacy (DP) noise, are then returned to the parameter server. FedDP-SA offers dual benefits: 1) it enhances model accuracy by efficiently lowering sensitivity, thereby reducing the noise needed to ensure DP, and 2) it improves communication efficiency by communicating model parameters at a lower frequency. These advantages are validated through sensitivity analysis and convergence rate analysis. Finally, we conduct comprehensive experiments to verify the performance of FedDP-SA against other state-of-the-art baseline algorithms.
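The client-side procedure described in the abstract (split the local data set into subsets, update parameters on each subset, average the updates, then add DP noise before uploading) can be illustrated with a short sketch. The code below is a hypothetical, minimal Python/NumPy illustration under stated assumptions and is not the authors' implementation: it assumes a Gaussian mechanism, a logistic-regression local model, and an L2 clipping bound clip_norm standing in for the paper's sensitivity bound; the function name local_update_feddp_sa and all parameter names are invented for illustration.

import numpy as np

def local_update_feddp_sa(w_global, X, y, num_subsets=4, local_steps=5,
                          lr=0.1, clip_norm=1.0, noise_sigma=0.5, rng=None):
    """Sketch of a FedDP-SA-style client round: split the local data set,
    train one model copy per subset from the global weights, average the
    per-subset updates, clip, and add Gaussian noise before uploading."""
    rng = np.random.default_rng() if rng is None else rng
    indices = rng.permutation(len(X))
    subsets = np.array_split(indices, num_subsets)

    updates = []
    for sub in subsets:
        w = w_global.copy()
        Xs, ys = X[sub], y[sub]
        for _ in range(local_steps):
            # Plain logistic-regression gradient step on this subset only.
            p = 1.0 / (1.0 + np.exp(-Xs @ w))
            grad = Xs.T @ (p - ys) / len(ys)
            w -= lr * grad
        updates.append(w - w_global)

    # Averaging over subsets is what lowers sensitivity: any single sample
    # influences only one subset, hence only 1/num_subsets of the average.
    avg_update = np.mean(updates, axis=0)

    # Clip to bound the L2 sensitivity, then add Gaussian noise for DP.
    norm = np.linalg.norm(avg_update)
    avg_update *= min(1.0, clip_norm / (norm + 1e-12))
    noisy_update = avg_update + rng.normal(0.0, noise_sigma * clip_norm,
                                           size=avg_update.shape)
    # Only this single privatized update is sent to the parameter server,
    # regardless of how many subsets were used locally.
    return w_global + noisy_update

# Toy usage on synthetic data (one client, one round).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X @ np.array([1.0, -1.0, 0.5, 0.0, 2.0]) > 0).astype(float)
w_new = local_update_feddp_sa(np.zeros(5), X, y, rng=rng)

The noise scale noise_sigma above is a placeholder; in the paper the Gaussian noise would instead be calibrated to the privacy budget via the Gaussian mechanism and the derived sensitivity bound.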
Pages: 31687-31698
Number of pages: 12