Toward the Application of Differential Privacy to Data Collaboration

Cited by: 0
Authors
Yamashiro, Hiromi [1 ]
Omote, Kazumasa [2 ]
Imakura, Akira [2 ]
Sakurai, Tetsuya [2 ]
Affiliations
[1] Univ Tsukuba, Grad Sch Sci & Technol, Tsukuba 3058577, Japan
[2] Univ Tsukuba, Inst Syst & Informat Engn, Tsukuba 3058577, Japan
Keywords
Differential privacy; dimension reduction; distributed machine learning; federated learning; principal component analysis
DOI
10.1109/ACCESS.2024.3396146
Chinese Library Classification
TP [automation technology; computer technology]
Discipline classification code
0812
Abstract
Federated Learning, a model-sharing method, and Data Collaboration, a non-model-sharing method, are recognized as data analysis methods for distributed data. In Federated Learning, clients send only the parameters of a machine learning model to the central server. In Data Collaboration, clients send data that has been irreversibly transformed through dimensionality reduction to the central server. Both methods are designed with privacy in mind, but neither guarantees privacy on its own. Differential Privacy, a theoretical and quantitative privacy criterion, has been applied to Federated Learning to achieve rigorous privacy preservation. In this paper, we introduce a novel method based on PCA (Principal Component Analysis), which finds a variance-preserving low-rank approximation of a matrix, aiming to apply Differential Privacy to Data Collaboration. Experimental evaluation of the proposed method shows that differentially private Data Collaboration achieves performance comparable to that of differentially private Federated Learning.
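The abstract describes the approach only at a high level, so the following is a minimal illustrative sketch under stated assumptions, not the authors' construction: each client reduces its local data with PCA so that only an irreversibly transformed, low-dimensional representation leaves the client, and Gaussian noise is added to that representation as a simplified stand-in for a calibrated Differential Privacy mechanism. The function names, clipping bound, and noise scale below are assumptions made for illustration.

# Illustrative sketch only: local PCA reduction plus Gaussian noise before sharing.
# The clipping bound and noise scale are placeholders, not a calibrated DP mechanism.
import numpy as np


def local_pca_projection(X, n_components):
    """Return the top n_components principal directions of the local data X."""
    X_centered = X - X.mean(axis=0)
    # SVD-based PCA: rows of Vt are the principal directions.
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return Vt[:n_components].T  # shape: (n_features, n_components)


def share_reduced_data(X, projection, clip_norm=1.0, noise_std=0.1, rng=None):
    """Project local data to the low-dimensional space, clip row norms,
    and add Gaussian noise before sending the result to the central server."""
    rng = np.random.default_rng() if rng is None else rng
    Z = (X - X.mean(axis=0)) @ projection
    # Clip each row so a single record has bounded influence (sensitivity bound).
    norms = np.maximum(np.linalg.norm(Z, axis=1, keepdims=True) / clip_norm, 1.0)
    Z = Z / norms
    # Gaussian noise as a simplified stand-in for a calibrated DP mechanism.
    return Z + rng.normal(scale=noise_std, size=Z.shape)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two clients with synthetic local datasets (100 records, 20 features each).
    clients = [rng.normal(size=(100, 20)) for _ in range(2)]
    shared = []
    for X in clients:
        P = local_pca_projection(X, n_components=5)
        shared.append(share_reduced_data(X, P, rng=rng))
    # The central server only ever sees the noised, low-dimensional representations.
    collected = np.vstack(shared)
    print(collected.shape)  # (200, 5)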
Pages: 63292-63301
Page count: 10