Data Level Privacy Preserving: A Stochastic Perturbation Approach Based on Differential Privacy

Cited by: 16
Authors
Ma, Chuan [1 ,2 ]
Yuan, Long [3 ]
Han, Li [4 ]
Ding, Ming [5 ]
Bhaskar, Raghav [5 ]
Li, Jun [1 ]
Affiliations
[1] Nanjing Univ Sci & Technol, Sch Elect & Opt Engn, Nanjing 210094, Jiangsu, Peoples R China
[2] Southeast Univ, Key Lab Comp Network & Informat Integrat, Minist Educ, Nanjing 211189, Peoples R China
[3] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Nanjing 210094, Jiangsu, Peoples R China
[4] East China Normal Univ, Software Engn Inst, Shanghai 200050, Peoples R China
[5] CSIRO, Data61, Sydney, NSW 1710, Australia
Funding
National Natural Science Foundation of China
Keywords
Differential privacy; stochastic perturbation; tabular dataset; energy-efficient
DOI
10.1109/TKDE.2021.3137047
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
With the vast amount of data now available, especially data collected from the ubiquitous Internet of Things (IoT), privacy leakage has become an increasingly pressing concern. To preserve the privacy of IoT datasets, traditional methods usually add calibrated random noise to the data values to achieve differential privacy (DP). However, the amount of noise must be designed carefully, and a careless choice will degrade the utility of the dataset. In this work, we therefore propose a stochastic perturbation method to sanitize the dataset, where the perturbation applied to each sample is obtained from the remaining samples in the same dataset. In addition, we derive an expression for the utility level under this framework and prove that the proposed algorithm achieves $\epsilon$-DP. To demonstrate its effectiveness, we conduct extensive experiments on real-life datasets across various tasks, such as query answering and machine learning. Compared with state-of-the-art methods, the proposed algorithm achieves better performance at the same privacy level.
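For context, the $\epsilon$-DP guarantee claimed in the abstract is the standard one: a randomized mechanism $\mathcal{M}$ satisfies $\epsilon$-differential privacy if, for any two neighboring datasets $D$ and $D'$ differing in a single record and any set of outputs $S$, $\Pr[\mathcal{M}(D) \in S] \le e^{\epsilon} \Pr[\mathcal{M}(D') \in S]$. The record does not spell out the perturbation rule itself, so the sketch below is only a minimal illustration of the stated idea, namely perturbing each row of a tabular dataset with material drawn from the remaining rows, and not the authors' algorithm; the function name stochastic_perturbation and the mix_prob parameter are assumptions made here for illustration.

    # Illustrative sketch only: the paper's actual perturbation rule and its
    # privacy calibration are not given in this record. Here each row is mixed,
    # column by column, with a donor row drawn uniformly from the rest of the
    # same dataset; mix_prob is an assumed tuning parameter.
    import numpy as np

    def stochastic_perturbation(data, mix_prob=0.5, seed=None):
        """Sanitize a numeric tabular dataset (n_rows x n_cols) row by row."""
        rng = np.random.default_rng(seed)
        n, d = data.shape
        sanitized = data.copy()
        for i in range(n):
            # Choose a donor row uniformly from the other n - 1 rows.
            j = rng.integers(n - 1)
            if j >= i:
                j += 1
            # Replace a randomly chosen subset of row i's entries with the donor's.
            mask = rng.random(d) < mix_prob
            sanitized[i, mask] = data[j, mask]
        return sanitized

    # Usage: sanitize a toy table of 5 rows and 3 columns.
    # table = np.arange(15, dtype=float).reshape(5, 3)
    # print(stochastic_perturbation(table, mix_prob=0.3, seed=0))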
Pages: 3619-3631
Page count: 13