Differentially Private Robust Low-Rank Approximation

Times Cited: 0
Authors
Arora, Raman [1 ]
Braverman, Vladimir [1 ]
Upadhyay, Jalaj [1 ]
Affiliation
[1] Johns Hopkins Univ, Baltimore, MD 21201 USA
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018) | 2018, Vol. 31
Keywords
DOI
None listed
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we study the following robust low-rank matrix approximation problem: given a matrix $A \in \mathbb{R}^{n \times d}$, find a rank-$k$ matrix $M$, while satisfying differential privacy, such that $\|A - M\|_p \le \alpha \cdot \mathrm{OPT}_k(A) + \tau$, where $\|B\|_p$ is the entry-wise $\ell_p$-norm of $B$ and $\mathrm{OPT}_k(A) := \min_{\mathrm{rank}(X) \le k} \|A - X\|_p$. It is well known that low-rank approximation w.r.t. the entrywise $\ell_p$-norm, for $p \in [1, 2)$, yields robustness to gross outliers in the data. We propose an algorithm that guarantees $\alpha = \tilde{O}(k^2)$ and $\tau = \tilde{O}(k^2(n + kd/\epsilon))$, runs in $\tilde{O}((n + d)\,\mathrm{poly}\,k)$ time, and uses $O(k(n + d)\log k)$ space. We study extensions to the streaming setting where entries of the matrix arrive in an arbitrary order and the output is produced at the very end or continually. We also study the related problem of differentially private robust principal component analysis (PCA), wherein we return a rank-$k$ projection matrix $\Pi$ such that $\|A - A\Pi\|_p \le \alpha \cdot \mathrm{OPT}_k(A) + \tau$.
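To make the objective concrete, the following is a minimal numpy sketch (not the paper's private algorithm) that computes the entrywise $\ell_p$ error $\|A - M\|_p$ of a rank-$k$ approximation. The rank-$k$ matrix here comes from a truncated SVD, a standard non-private, non-robust baseline that is optimal for the Frobenius norm rather than the $\ell_p$-norms with $p \in [1, 2)$ that the paper targets; the outlier injected below illustrates why the $\ell_1$ objective is the robust choice.

```python
import numpy as np

def entrywise_lp_error(A, M, p):
    """Entrywise l_p norm of the residual: (sum_ij |A_ij - M_ij|^p)^(1/p)."""
    return np.sum(np.abs(A - M) ** p) ** (1.0 / p)

def rank_k_svd(A, k):
    """Best rank-k approximation of A under the Frobenius norm (truncated SVD).
    This is a non-private, non-robust baseline, not the paper's algorithm."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
A[0, 0] += 100.0  # a single gross outlier: it dominates the l_2 objective
                  # far more than the l_1 objective, motivating p in [1, 2)
M = rank_k_svd(A, k=5)
print("l_1 error:", entrywise_lp_error(A, M, p=1.0))
print("l_2 error:", entrywise_lp_error(A, M, p=2.0))
```

A differentially private version would additionally randomize the computation so that changing one entry (or row) of $A$ changes the output distribution only slightly; the sketch above only evaluates the approximation objective itself.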
Pages: 9
References (33 total; first 10 shown)
[1] Achlioptas, D.; McSherry, F. On spectral learning of mixtures of distributions. Learning Theory, Proceedings, 2005, 3559: 458-469.
[2] [Anonymous], 2016, Wall Street Journal.
[3] [Anonymous], arXiv:1711.04740.
[4] Blocki, Jeremiah; Blum, Avrim; Datta, Anupam; Sheffet, Or. The Johnson-Lindenstrauss transform itself preserves differential privacy. 2012 IEEE 53rd Annual Symposium on Foundations of Computer Science (FOCS), 2012: 410-419.
[5] Blum, Avrim; Ligett, Katrina; Roth, Aaron. A learning theory approach to noninteractive database privacy. Journal of the ACM, 2013, 60(2).
[6] Brooks, J. P.; Dula, J. H.; Boone, E. L. A pure L1-norm principal component analysis. Computational Statistics & Data Analysis, 2013, 61: 83-98.
[7] Candes, Emmanuel J.; Li, Xiaodong; Ma, Yi; Wright, John. Robust principal component analysis? Journal of the ACM, 2011, 58(3).
[8] Chambers, J. M.; Mallows, C. L.; Stuck, B. W. A method for simulating stable random variables. Journal of the American Statistical Association, 1976, 71(354): 340-344.
[9] Cohen, Michael B.; Elder, Sam; Musco, Cameron; Musco, Christopher; Persu, Madalina. Dimensionality reduction for k-means clustering and low rank approximation. STOC'15: Proceedings of the 2015 ACM Symposium on Theory of Computing, 2015: 163-172.
[10] Drineas, P., 2002, Proceedings of the 34th Annual ACM Symposium on Theory of Computing, p. 82.