Differential Privacy for Class-Based Data: A Practical Gaussian Mechanism

Cited by: 2
Authors
Ramakrishna, Raksha [1]
Scaglione, Anna [2]
Wu, Tong [2]
Ravi, Nikhil [2]
Peisert, Sean [3]
Affiliations
[1] KTH Royal Inst Technol, Sch Elect Engn & Comp Sci, Div Network & Syst Engn, S-11428 Stockholm, Sweden
[2] Cornell Tech, Dept Elect & Comp Engn, New York, NY 10044 USA
[3] Lawrence Berkeley Natl Lab, Comp Sci Res, Berkeley, CA 94720 USA
Keywords
Differential privacy; class-based privacy; Gaussian mechanism; autoregression and moving average; smart meter data
DOI
10.1109/TIFS.2023.3289128
Chinese Library Classification
TP301 [Theory and Methods]
Discipline Code
081202
Abstract
In this paper, we present a notion of differential privacy (DP) for data drawn from different classes, where class membership is the private information to be protected. The proposed method is an output perturbation mechanism that adds noise to the released query response so that the analyst cannot infer the underlying class label. Beyond protecting the privacy of class-based data, the proposed DP method meets accuracy requirements and is computationally efficient and practical. We demonstrate empirically that it outperforms the baseline additive Gaussian noise mechanism. We also apply the proposed DP method to a real-world application, the autoregression and moving average (ARMA) forecasting method, protecting the privacy of the underlying data source. Case studies on real-world advanced metering infrastructure (AMI) measurements of household power consumption validate the excellent performance of the proposed DP method while preserving the accuracy of the forecasted power consumption measurements.
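The abstract does not reproduce the paper's noise calibration. For context, the following is a minimal sketch of the baseline additive Gaussian noise mechanism that the paper compares against, using the classical (epsilon, delta) calibration from Dwork and Roth (2014); the function name and interface are illustrative assumptions, and the class-based calibration proposed in the paper is not reproduced here.

```python
import numpy as np

def gaussian_mechanism(query_value, l2_sensitivity, epsilon, delta, rng=None):
    """Release query_value under (epsilon, delta)-DP via additive Gaussian noise.

    Illustrative sketch of the classical baseline mechanism: the noise scale
    sigma = sqrt(2 * ln(1.25 / delta)) * Delta_2 / epsilon follows the standard
    calibration (Dwork & Roth, 2014), valid for epsilon in (0, 1).
    """
    rng = np.random.default_rng() if rng is None else rng
    sigma = np.sqrt(2.0 * np.log(1.25 / delta)) * l2_sensitivity / epsilon
    noise = rng.normal(loc=0.0, scale=sigma, size=np.shape(query_value))
    return np.asarray(query_value, dtype=float) + noise

# Example usage (hypothetical data): privately release the mean of
# n readings bounded in [0, 1]; replacing one reading changes the
# mean by at most 1/n, which bounds the L2 sensitivity.
readings = np.array([0.2, 0.5, 0.7, 0.4])
sensitivity = 1.0 / len(readings)
private_mean = gaussian_mechanism(readings.mean(), sensitivity,
                                  epsilon=0.5, delta=1e-5)
```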
Pages: 5096-5108
Number of pages: 13