Customized privacy preserving for inherent data and latent data

Cited by: 23
Authors
He, Zaobo [1]
Cai, Zhipeng [1]
Sun, Yunchuan [2]
Li, Yingshu [1]
Cheng, Xiuzhen [3]
Affiliations
[1] Georgia State Univ, Dept Comp Sci, Atlanta, GA 30303 USA
[2] Beijing Normal Univ, Sch Business, Beijing 100875, Peoples R China
[3] George Washington Univ, Dept Comp Sci, Washington, DC 20052 USA
Funding
US National Science Foundation (NSF);
Keywords
Inherent data privacy; Latent data privacy; Data-sanitization; Differential privacy; Optimized tradeoff;
DOI
10.1007/s00779-016-0972-2
Chinese Library Classification (CLC)
TP [Automation technology, computer technology];
Discipline classification code
0812;
Abstract
The huge amount of sensory data collected from mobile devices offers great potential for services built on user data extracted from sensor readings. However, releasing user data can also seriously threaten user privacy. Sensitive information may be collected directly from released user data without user permission. Furthermore, third-party users can infer sensitive information latently contained in released data by applying data mining techniques. In this paper, we formally define these two types of threats as inherent data privacy and latent data privacy, and we construct a data-sanitization strategy that optimizes the tradeoff between data utility and customized protection of both types of privacy. The key novelty is that the developed strategy can defend against powerful third-party users who possess broad knowledge about users and launch optimal inference attacks. We show that our strategy protects sensitive information while only slightly reducing the benefit brought by user data. To the best of our knowledge, this is the first work that preserves both inherent data privacy and latent data privacy.
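
As background for the abstract and the "Differential privacy" and "Optimized tradeoff" keywords above, the following minimal sketch illustrates the utility-privacy tension that a data-sanitization strategy must balance, using the standard Laplace mechanism from differential privacy. It is not the paper's sanitization strategy; the function laplace_sanitize, the sample counts, and the epsilon sweep are hypothetical choices made purely for illustration.

# Minimal illustrative sketch only: a Laplace-mechanism sanitizer showing how
# the privacy budget epsilon trades off against data utility (measured here as
# mean absolute error). This is NOT the paper's strategy; all names and
# parameters below are hypothetical.
import numpy as np

def laplace_sanitize(values, sensitivity, epsilon, rng):
    """Add Laplace noise calibrated to the query's L1 sensitivity so the
    released values satisfy epsilon-differential privacy."""
    scale = sensitivity / epsilon  # smaller epsilon -> stronger privacy, more noise
    return values + rng.laplace(0.0, scale, size=len(values))

rng = np.random.default_rng(0)
true_counts = np.array([120.0, 75.0, 42.0])  # hypothetical counts derived from sensor readings

for eps in (0.1, 0.5, 1.0, 5.0):  # sweep the privacy budget
    noisy = laplace_sanitize(true_counts, sensitivity=1.0, epsilon=eps, rng=rng)
    error = np.mean(np.abs(noisy - true_counts))
    print(f"epsilon={eps:>4}: mean absolute error = {error:.2f}")

Smaller epsilon values give stronger privacy but larger error; optimizing this kind of tradeoff jointly for inherent and latent data privacy is the problem the paper addresses.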
Pages: 43-54
Number of pages: 12