Individual Differential Privacy: A Utility-Preserving Formulation of Differential Privacy Guarantees

Cited by: 107
Authors
Soria-Comas, Jordi [1 ]
Domingo-Ferrer, Josep [1 ]
Sanchez, David [1 ]
Megias, David [2 ]
Affiliations
[1] Univ Rovira & Virgili, Dept Comp Engn & Math, UNESCO Chair Data Privacy, E-43007 Tarragona, Catalonia, Spain
[2] Univ Oberta Catalunya, Internet Interdisciplinary Inst, Estudis Informat Multimedia & Telecomunicacio, E-08860 Castelldefels, Catalonia, Spain
Funding
UK Engineering and Physical Sciences Research Council; EU Horizon 2020;
Keywords
Data privacy; data utility; differential privacy; noise;
DOI
10.1109/TIFS.2017.2663337
CLC number
TP301 [Theory and Methods];
Discipline code
081202;
Abstract
Differential privacy is a popular privacy model within the research community because of the strong privacy guarantee it offers, namely that the presence or absence of any individual in a data set does not significantly influence the results of analyses on the data set. However, enforcing this strict guarantee in practice significantly distorts data and/or limits data uses, thus diminishing the analytical utility of the differentially private results. In an attempt to address this shortcoming, several relaxations of differential privacy have been proposed that trade off privacy guarantees for improved data utility. In this paper, we argue that the standard formalization of differential privacy is stricter than required by the intuitive privacy guarantee it seeks. In particular, the standard formalization requires indistinguishability of results between any pair of neighbor data sets, while indistinguishability between the actual data set and its neighbor data sets should be enough. This limits the data controller's ability to adjust the level of protection to the actual data, hence resulting in significant accuracy loss. In this respect, we propose individual differential privacy, an alternative differential privacy notion that offers the same privacy guarantees as standard differential privacy to individuals (even though not to groups of individuals). This new notion allows the data controller to adjust the distortion to the actual data set, which results in less distortion and more analytical accuracy. We propose several mechanisms to attain individual differential privacy and we compare the new notion against standard differential privacy in terms of the accuracy of the analytical results.
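The abstract's central idea — calibrating distortion to neighbours of the *actual* data set rather than to every pair of neighbouring data sets — can be illustrated with a toy sketch. This is not the authors' mechanism; the bounded-mean query, the function names, and the replace-one-record neighbourhood are all illustrative assumptions:

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(max(1.0 - 2.0 * abs(u), 1e-300))

def mean_query(data):
    return sum(data) / len(data)

def global_noise_scale(n, lo, hi, epsilon):
    # Standard DP: noise calibrated to the *global* sensitivity of the mean,
    # i.e. the worst case over all pairs of neighbouring data sets: (hi - lo) / n.
    return (hi - lo) / (n * epsilon)

def individual_noise_scale(data, lo, hi, epsilon):
    # Individual-DP-style idea: calibrate to the worst change over neighbours
    # of the *actual* data set only (replace one record by any value in [lo, hi]).
    n = len(data)
    sensitivity = max(max(x - lo, hi - x) for x in data) / n
    return sensitivity / epsilon

# Records known to lie in [0, 1] but clustered around 0.5.
data = [0.4, 0.5, 0.6]
eps = 1.0
s_global = global_noise_scale(len(data), 0.0, 1.0, eps)  # 1/3
s_indiv = individual_noise_scale(data, 0.0, 1.0, eps)    # 0.2, smaller than 1/3
noisy_mean = mean_query(data) + laplace_noise(s_indiv)   # less distortion
```

Because the clustered records never sit at the extremes of [0, 1], the data-dependent scale is strictly smaller than the global one, which is the utility gain the abstract describes; the paper's actual mechanisms for attaining individual differential privacy are more involved than this sketch.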
Pages: 1418-1429
Page count: 12