A Programming Language for Data Privacy with Accuracy Estimations

Cited by: 3
Authors
Lobo-Vesga, Elisabet [1]
Russo, Alejandro [1]
Gaboardi, Marco [2]
Affiliations
[1] Chalmers Univ Technol, Gothenburg, Sweden
[2] Boston Univ, Boston, MA 02215 USA
Source
ACM TRANSACTIONS ON PROGRAMMING LANGUAGES AND SYSTEMS | 2021, Vol. 43, No. 2
Funding
US National Science Foundation;
Keywords
Accuracy; concentration bounds; differential privacy; databases; Haskell; sensitivity;
DOI
10.1145/3452096
Chinese Library Classification
TP31 [Computer software];
Subject Classification Codes
081202; 0835;
Abstract
Differential privacy offers a formal framework for reasoning about the privacy and accuracy of computations on private data. It also offers a rich set of building blocks for constructing private data analyses. When carefully calibrated, these analyses simultaneously guarantee the privacy of the individuals contributing their data, and the accuracy of the data analysis results, inferring useful properties about the population. The compositional nature of differential privacy has motivated the design and implementation of several programming languages to ease the implementation of differentially private analyses. Even though these programming languages provide support for reasoning about privacy, most of them disregard reasoning about the accuracy of data analyses. To overcome this limitation, we present DPella, a programming framework providing data analysts with support for reasoning about privacy, accuracy, and their trade-offs. The distinguishing feature of DPella is a novel component that statically tracks the accuracy of different data analyses. To provide tight accuracy estimations, this component leverages taint analysis for automatically inferring statistical independence of the different noise quantities added for guaranteeing privacy. We evaluate our approach by implementing several classical queries from the literature and showing how data analysts can calibrate the privacy parameters to meet the accuracy requirements, and vice versa.
Pages: 42
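The abstract's closing claim, that analysts can calibrate the privacy parameters to meet accuracy requirements and vice versa, can be made concrete with the standard tail bound of the Laplace mechanism: with noise scale sensitivity/epsilon, the absolute error stays below alpha = (sensitivity/epsilon) * ln(1/beta) with probability at least 1 - beta. The Haskell sketch below only illustrates this arithmetic under that assumption; accuracyBound and epsilonFor are hypothetical names, not the DPella API.

-- Minimal sketch of the Laplace-mechanism privacy/accuracy trade-off.
-- Assumes noise scale b = sensitivity / eps, so P(|error| > alpha) <= beta
-- when alpha = (sensitivity / eps) * ln (1 / beta).

-- Error bound alpha that holds with confidence 1 - beta.
accuracyBound :: Double  -- query sensitivity
              -> Double  -- privacy parameter epsilon
              -> Double  -- failure probability beta
              -> Double
accuracyBound sens eps beta = (sens / eps) * log (1 / beta)

-- The "vice versa" direction: smallest epsilon that keeps the error
-- at most alpha with confidence 1 - beta.
epsilonFor :: Double  -- query sensitivity
           -> Double  -- accuracy target alpha
           -> Double  -- failure probability beta
           -> Double
epsilonFor sens alpha beta = (sens / alpha) * log (1 / beta)

main :: IO ()
main = do
  -- A counting query has sensitivity 1; ask for 95% confidence (beta = 0.05).
  print (accuracyBound 1 0.5 0.05)  -- error bound for eps = 0.5 (about 6.0)
  print (epsilonFor 1 3 0.05)       -- epsilon needed for error <= 3 (about 1.0)

Because the bound can be inverted in either direction, an analyst can start from an accuracy target and derive the required privacy budget, or fix the budget and read off the achievable error, which is the kind of calibration the abstract describes.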