Differential Privacy for Class-Based Data: A Practical Gaussian Mechanism

Cited by: 2
Authors
Ramakrishna, Raksha [1 ]
Scaglione, Anna [2 ]
Wu, Tong [2 ]
Ravi, Nikhil [2 ]
Peisert, Sean [3 ]
Affiliations
[1] KTH Royal Inst Technol, Sch Elect Engn & Comp Sci, Div Network & Syst Engn, S-11428 Stockholm, Sweden
[2] Cornell Tech, Dept Elect & Comp Engn, New York, NY 10044 USA
[3] Lawrence Berkeley Natl Lab, Comp Sci Res, Berkeley, CA 94720 USA
Keywords
Differential privacy; class-based privacy; Gaussian mechanism; autoregression and moving average; smart meter data
DOI
10.1109/TIFS.2023.3289128
CLC Number
TP301 [Theory and Methods]
Subject Classification Code
081202
Abstract
In this paper, we present a notion of differential privacy (DP) for data that come from different classes, where class membership is the private information to be protected. The proposed method is an output perturbation mechanism that adds noise to the released query response so that the analyst cannot infer the underlying class label. The proposed DP method not only protects the privacy of class-based data but also meets accuracy requirements and is computationally efficient and practical. We empirically illustrate the efficacy of the proposed method, showing that it outperforms the baseline additive Gaussian noise mechanism. We also examine a real-world application, applying the proposed DP method to autoregression and moving average (ARMA) forecasting while protecting the privacy of the underlying data source. Case studies on real-world advanced metering infrastructure (AMI) measurements of household power consumption validate the strong performance of the proposed DP method while preserving the accuracy of the forecasted power consumption measurements.
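For context, the sketch below illustrates the baseline additive Gaussian noise mechanism that the abstract names as the point of comparison; it is not the paper's class-based mechanism. The function name, the sensitivity value, and the example numbers are illustrative assumptions, and the noise scale follows the classical (epsilon, delta)-DP calibration sigma = sqrt(2 ln(1.25/delta)) * Delta_2 / epsilon, valid for epsilon < 1.

import numpy as np

def gaussian_mechanism(query_value, l2_sensitivity, epsilon, delta, rng=None):
    # Baseline additive Gaussian mechanism for (epsilon, delta)-DP.
    # Not the paper's class-based mechanism; a generic sketch only.
    # Noise is calibrated to the query's L2 sensitivity via the classical
    # bound sigma >= sqrt(2 ln(1.25/delta)) * Delta_2 / epsilon (epsilon < 1).
    rng = np.random.default_rng() if rng is None else rng
    sigma = np.sqrt(2.0 * np.log(1.25 / delta)) * l2_sensitivity / epsilon
    noise = rng.normal(loc=0.0, scale=sigma, size=np.shape(query_value))
    return np.asarray(query_value, dtype=float) + noise

# Illustrative usage: privatize an average household power reading (kW),
# assuming a per-record L2 sensitivity of 0.5 kW for the averaging query.
private_avg = gaussian_mechanism(1.73, l2_sensitivity=0.5, epsilon=0.8, delta=1e-5)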
Pages: 5096-5108
Page count: 13