Unexpected Information Leakage of Differential Privacy Due to the Linear Property of Queries

Cited: 5
Authors
Huang, Wen [1 ]
Zhou, Shijie [1 ]
Liao, Yongjian [1 ]
Institution
[1] Univ Elect Sci & Technol China, Sch Informat & Software Engn, Chengdu 610054, Peoples R China
Keywords
Privacy; Differential privacy; Sensitivity; Correlation; Testing; National Institutes of Health; Switches; Laplace mechanism; membership inference attacks; differential privacy; linear property;
DOI
10.1109/TIFS.2021.3075843
CLC Classification
TP301 [Theory, Methods];
Subject Classification
081202 ;
Abstract
Differential privacy is a widely accepted notion of privacy preservation, and the Laplace mechanism is a well-known instance of differentially private mechanisms for numerical data. In this paper, we show that differential privacy does not take the linear property of queries into account, resulting in unexpected information leakage. Specifically, the linear property makes it possible to divide one query into two: q(D) = q(D₁) + q(D₂) whenever D = D₁ ∪ D₂ and D₁ ∩ D₂ = ∅. Attackers seeking an answer to q(D) can not only issue the query q(D) directly but can also issue q(D₁) and compute q(D₂) themselves, as long as they know D₂. Through different divisions of one query, attackers can obtain multiple different answers to the same query from differentially private mechanisms. However, if the divisions are carefully designed, the total consumed privacy budget differs between the attackers' perspective and the mechanisms' perspective. This difference leads to unexpected information leakage, because the privacy budget is the key parameter controlling the amount of information legally released by differentially private mechanisms. To demonstrate this leakage, we present a membership inference attack against the Laplace mechanism. Specifically, under the constraints of differential privacy, we propose a method for obtaining multiple independent and identically distributed samples of answers to queries that satisfy the linear property. The method relies on the linear property and some background knowledge of the attackers. When the background knowledge is sufficient, the method can obtain enough samples from differentially private mechanisms that the total consumed privacy budget becomes unreasonably large. Based on the obtained samples, a hypothesis testing method determines whether a target record is in the target dataset.
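The attack described in the abstract can be illustrated with a short simulation. The following is a minimal Python sketch, not the authors' implementation: it assumes a counting query q, a per-query budget ε = 0.5, and an attacker whose background knowledge covers every record of D except possibly the target; the names (`dp_answer`, `laplace_noise`) and the particular partitioning scheme are illustrative.

```python
import math
import random
import statistics

random.seed(0)

EPS = 0.5          # per-query privacy budget (assumed for illustration)
SENSITIVITY = 1.0  # sensitivity of a counting query

def laplace_noise(scale):
    """Sample Laplace(0, scale) by inverse-CDF transform."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def q(dataset):
    """A linear counting query: q(D1 ∪ D2) = q(D1) + q(D2) for disjoint D1, D2."""
    return float(len(dataset))

def dp_answer(dataset):
    """Laplace mechanism: noisy answer, charged EPS per query."""
    return q(dataset) + laplace_noise(SENSITIVITY / EPS)

D = list(range(100))  # target dataset; |D| is what the attacker estimates

# Different divisions D = D1 ∪ D2 with D1 ∩ D2 = ∅.  For each division the
# attacker issues only q(D1) and computes q(D2) from background knowledge,
# so every iteration yields an independent Laplace-noised sample of q(D).
samples = [dp_answer(D[:k]) + q(D[k:]) for k in range(10, 60)]

# The mechanism sees 50 "different" queries, each charged EPS, but the
# attacker holds 50 i.i.d. samples of q(D): averaging shrinks the noise
# as if a single query had been answered with budget 50 * EPS.
estimate = statistics.mean(samples)

# Membership test: H0: |D| = 99 (target absent) vs H1: |D| = 100 (present).
target_present = abs(estimate - 100.0) < abs(estimate - 99.0)
```

With 50 samples, the standard deviation of `estimate` drops from roughly 2.8 per answer to about 0.4, enough to separate the two hypotheses even though each individual answer was privatized at budget ε.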
Pages: 3123-3137
Page count: 15
Related Papers
50 records in total
  • [31] Transfer learning for linear regression with differential privacy
    Hou, Yiming
    Song, Yunquan
    Wang, Zhijian
    COMPLEX & INTELLIGENT SYSTEMS, 2025, 11 (01)
  • [32] Asking the Proper Question: Adjusting Queries to Statistical Procedures Under Differential Privacy
    Shoham, Tomer
    Rinott, Yosef
    PRIVACY IN STATISTICAL DATABASES, PSD 2022, 2022, 13463 : 46 - 61
  • [33] Information-Theoretic Approaches to Differential Privacy
    Unsal, Ayse
    Onen, Melek
    ACM COMPUTING SURVEYS, 2024, 56 (03)
  • [34] On the Relation Between Identifiability, Differential Privacy, and Mutual-Information Privacy
    Wang, Weina
    Ying, Lei
    Zhang, Junshan
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2016, 62 (09) : 5018 - 5029
  • [35] Quantum Differential Privacy: An Information Theory Perspective
    Hirche, Christoph
    Rouze, Cambyse
    Franca, Daniel Stilck
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2023, 69 (09) : 5771 - 5787
  • [36] Evaluating Differential Privacy on Correlated Datasets Using Pointwise Maximal Leakage
    Saeidian, Sara
    Oechtering, Tobias J.
    Skoglund, Mikael
    PRIVACY TECHNOLOGIES AND POLICY, APF 2024, 2024, 14831 : 73 - 86
  • [37] Information entropy differential privacy: A differential privacy protection data method based on rough set theory
    Li, Xianxian
    Luo, Chunfeng
    Liu, Peng
    Wang, Li-E
    IEEE 17TH INT CONF ON DEPENDABLE, AUTONOM AND SECURE COMP / IEEE 17TH INT CONF ON PERVAS INTELLIGENCE AND COMP / IEEE 5TH INT CONF ON CLOUD AND BIG DATA COMP / IEEE 4TH CYBER SCIENCE AND TECHNOLOGY CONGRESS (DASC/PICOM/CBDCOM/CYBERSCITECH), 2019, : 918 - 923
  • [38] Answering n^{2+o(1)} Counting Queries with Differential Privacy Is Hard
    Ullman, Jonathan
    SIAM JOURNAL ON COMPUTING, 2016, 45 (02) : 473 - 496
  • [39] Answering n^{2+o(1)} Counting Queries with Differential Privacy is Hard
    Ullman, Jonathan
    STOC'13: PROCEEDINGS OF THE 2013 ACM SYMPOSIUM ON THEORY OF COMPUTING, 2013, : 361 - 370
  • [40] Local dampening: differential privacy for non-numeric queries via local sensitivity
    Farias, Victor A. E.
    Brito, Felipe T.
    Flynn, Cheryl
    Machado, Javam C.
    Majumdar, Subhabrata
    Srivastava, Divesh
    VLDB JOURNAL, 2023, 32 (06) : 1191 - 1214