Frequency Estimation Mechanisms Under (ϵ,δ)-Utility-Optimized Local Differential Privacy

Cited by: 4
Authors
Zhang, Yue [1 ]
Zhu, Youwen [1 ]
Zhou, Yuqian [2 ]
Yuan, Jiabin [1 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Coll Comp Sci & Technol, Nanjing 211106, Jiangsu, Peoples R China
[2] Nanjing Univ Aeronaut & Astronaut, Nanjing 211106, Jiangsu, Peoples R China
Keywords
Frequency estimation; Differential privacy; Privacy; Data models; Estimation error; Data analysis; Standards; Utility-optimized local differential privacy; Data collection; Recommendation
DOI
10.1109/TETC.2023.3238839
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology]
Discipline Classification Code
0812
Abstract
Frequency estimation mechanisms are widely applied in domains such as machine learning and cloud computing, where statistical information must be provided. As a fundamental operation in these domains, frequency estimation works on personal data that contains sensitive information, which must be protected from others. Motivated by this, we preserve users' privacy under local differential privacy (LDP) by obfuscating personal data on the user side. In this paper, we propose frequency estimation mechanisms under utility-optimized local differential privacy (ULDP), which allow the data collector to obtain some non-sensitive values to improve data utility while preventing sensitive values from leaking sensitive information. We propose three frequency estimation mechanisms under (ε, δ)-ULDP (uRFM-GRR, uRFM-RAPPOR, and uRFM-OLH) to protect users' sensitive information. The proposed mechanisms provide the same privacy guarantee and are suited to different scenarios. In addition, we theoretically compare the estimation errors of our mechanisms with those of existing LDP-based mechanisms and show that ours are lower. Finally, we conduct experiments on synthetic and real-world datasets to evaluate the performance of the three mechanisms. The experimental results demonstrate that our mechanisms outperform existing LDP-based solutions at the same privacy level, with uRFM-OLH most frequently performing best.
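As background for the kind of baseline the abstract compares against, the following is a minimal sketch of frequency estimation with Generalized Randomized Response (GRR) under plain ε-LDP. The uRFM-GRR mechanism in the paper adapts this idea to the (ε, δ)-ULDP setting; the function names and parameters below are illustrative, not taken from the paper.

```python
import math
import random
from collections import Counter

def grr_perturb(value, domain, epsilon, rng=random):
    """GRR: report the true value with probability
    p = e^eps / (e^eps + k - 1), otherwise report a uniformly
    chosen *different* value from the domain."""
    k = len(domain)
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if rng.random() < p:
        return value
    return rng.choice([v for v in domain if v != value])

def grr_estimate(reports, domain, epsilon):
    """Unbiased frequency estimates from perturbed reports:
    f_hat(v) = (c_v / n - q) / (p - q), where q is the probability
    that any fixed value is reported when it is not the true one."""
    n = len(reports)
    k = len(domain)
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    q = 1.0 / (math.exp(epsilon) + k - 1)
    counts = Counter(reports)
    return {v: (counts[v] / n - q) / (p - q) for v in domain}
```

Note that the estimates always sum to exactly 1 regardless of noise, since 1 − kq = p − q for GRR; individual estimates concentrate around the true frequencies as the number of users grows.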
Pages: 316-327 (12 pages)