Distribution-invariant differential privacy

Cited by: 3
Authors
Bi, Xuan [1 ]
Shen, Xiaotong [2 ]
Affiliations
[1] Univ Minnesota, Carlson Sch Management, Informat & Decis Sci, Minneapolis, MN 55455 USA
[2] Univ Minnesota, Sch Stat, Minneapolis, MN USA
Keywords
Privacy protection; Distribution preservation; Data sharing; Data perturbation; Randomized mechanism; Noise
DOI
10.1016/j.jeconom.2022.05.004
Chinese Library Classification
F [Economics]
Subject Classification Code
02
Abstract
Differential privacy is becoming a gold standard for protecting the privacy of publicly shared data. It has been widely used in social science, data science, public health, information technology, and the U.S. decennial census. Nevertheless, to guarantee differential privacy, existing methods may unavoidably alter the conclusion of the original data analysis, as privatization often changes the sample distribution. This phenomenon is known as the trade-off between privacy protection and statistical accuracy. In this work, we mitigate this trade-off by developing a distribution-invariant privatization (DIP) method to reconcile both high statistical accuracy and strict differential privacy. As a result, any downstream statistical or machine learning task yields essentially the same conclusion as if one used the original data. Numerically, under the same strictness of privacy protection, DIP achieves superior statistical accuracy in a wide range of simulation studies and real-world benchmarks. © 2022 Elsevier B.V. All rights reserved.
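The trade-off the abstract describes can be seen with the standard Laplace mechanism, a generic differentially private baseline (not the paper's DIP method): adding noise to a sample inflates its variance, so distribution-dependent conclusions drawn from the privatized data differ from those drawn from the original. A minimal sketch, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_mechanism(values, epsilon, sensitivity=1.0):
    """Epsilon-differentially private release of `values` via additive
    Laplace noise with scale sensitivity/epsilon (the standard mechanism)."""
    scale = sensitivity / epsilon
    return values + rng.laplace(loc=0.0, scale=scale, size=values.shape)

# A sample from Uniform(0, 1): true variance is 1/12, roughly 0.083.
x = rng.uniform(0.0, 1.0, size=100_000)

# After privatization the variance inflates by 2 * (sensitivity/epsilon)^2 = 2,
# so any downstream analysis of the spread of the data is distorted.
x_priv = laplace_mechanism(x, epsilon=1.0)

print(f"original variance:   {x.var():.3f}")   # close to 0.083
print(f"privatized variance: {x_priv.var():.3f}")  # close to 2.083
```

This is exactly the distribution change that a distribution-invariant approach aims to avoid, by making the privatized sample follow the same distribution as the original.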
Pages: 444 - 453
Page count: 10
Related Papers (50 results)
  • [1] Distribution Simulation Under Local Differential Privacy
    Asoodeh, Shahab
    2022 17TH CANADIAN WORKSHOP ON INFORMATION THEORY (CWIT), 2022, : 57 - 61
  • [2] Invariant Post-Random Response Perturbation for Correlated Attributes Under Local Differential Privacy Constraint
    Yang G.-M.
    Zhu H.-M.
    Fang X.-J.
    Su S.-Z.
    Tien Tzu Hsueh Pao/Acta Electronica Sinica, 2019, 47 (05): : 1079 - 1085
  • [3] Privacy-Aware Eye Tracking Using Differential Privacy
    Steil, Julian
    Hagestedt, Inken
    Huang, Michael Xuelin
    Bulling, Andreas
    ETRA 2019: 2019 ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS, 2019,
  • [4] Privacy-preserving collaborative filtering algorithm based on local differential privacy
    Bao, Ting
    Xu, Lei
    Zhu, Liehuang
    Wang, Lihong
    Li, Ruiguang
    Li, Tielei
    CHINA COMMUNICATIONS, 2021, 18 (11) : 42 - 60
  • [5] Differential Privacy for Deep and Federated Learning: A Survey
    El Ouadrhiri, Ahmed
    Abdelhadi, Ahmed
    IEEE ACCESS, 2022, 10 : 22359 - 22380
  • [6] A Survey of Differential Privacy Techniques for Federated Learning
    Wang, Xin
    Li, Jiaqian
    Ding, Xueshuang
    Zhang, Haoji
    Sun, Lianshan
    IEEE ACCESS, 2025, 13 : 6539 - 6555
  • [7] When Differential Privacy Implies Syntactic Privacy
    Ekenstedt, Emelie
    Ong, Lawrence
    Liu, Yucheng
    Johnson, Sarah
    Yeoh, Phee Lep
    Kliewer, Joerg
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2022, 17 : 2110 - 2124
  • [8] Computational Differential Privacy
    Mironov, Ilya
    Pandey, Omkant
    Reingold, Omer
    Vadhan, Salil
    ADVANCES IN CRYPTOLOGY - CRYPTO 2009, 2009, 5677 : 126+
  • [9] Differential-Privacy-Based Citizen Privacy Preservation in E-Government Applications
    Shi, Yajuan
    Piao, Chunhui
    Pan, Xiao
    2016 IEEE 13TH INTERNATIONAL CONFERENCE ON E-BUSINESS ENGINEERING (ICEBE), 2016, : 158 - 163
  • [10] Wasserstein Differential Privacy
    Yang, Chengyi
    Qi, Jiayin
    Zhou, Aimin
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 15, 2024, : 16299 - 16307