DISTRIBUTED ESTIMATION OF PRINCIPAL EIGENSPACES

Cited by: 125
Authors
Fan, Jianqing [1 ]
Wang, Dong [1 ]
Wang, Kaizheng [1 ]
Zhu, Ziwei [2 ]
Affiliations
[1] Princeton Univ, Dept Operat Res & Financial Engn, Princeton, NJ 08544 USA
[2] Univ Michigan, Dept Stat, Ann Arbor, MI 48109 USA
Keywords
Distributed learning; PCA; one-shot approach; communication efficiency; unbiasedness of empirical eigenspaces; heterogeneity; LARGEST EIGENVALUE; HIGH DIMENSION; SPARSE PCA; PERTURBATION; ASYMPTOTICS; MATRIX; CONSISTENCY; EIGENSTRUCTURE; COMPLEX; BOUNDS
DOI
10.1214/18-AOS1713
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Codes
020208; 070103; 0714
Abstract
Principal component analysis (PCA) is fundamental to statistical machine learning. It extracts the latent principal factors that account for most of the variation in the data. When data are stored across multiple machines, however, communication costs can prohibit computing PCA in a central location, and distributed algorithms for PCA are thus needed. This paper proposes and studies a distributed PCA algorithm: each node machine computes the top K eigenvectors of its local sample covariance matrix and transmits them to the central server; the central server then aggregates the information from all the node machines and conducts a PCA based on the aggregated information. We investigate the bias and variance of the resulting distributed estimator of the top K eigenvectors. In particular, we show that for distributions with symmetric innovations, the empirical top eigenspaces are unbiased, and hence the distributed PCA is "unbiased." We derive the rate of convergence for distributed PCA estimators, which depends explicitly on the effective rank of the covariance matrix, the eigengap, and the number of machines. We show that when the number of machines is not unreasonably large, the distributed PCA performs as well as the whole-sample PCA, even without full access to the whole data set. The theoretical results are verified by an extensive simulation study. We also extend our analysis to the heterogeneous case, where the population covariance matrices differ across local machines but share similar top eigenstructures.
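For concreteness, the one-shot scheme described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' reference implementation: the aggregation step here averages the local projection matrices V V^T before a final eigendecomposition, which is one natural reading of "aggregates the information," and all function names and the synthetic data in the usage lines are illustrative assumptions.

    import numpy as np

    def local_top_k_eigvecs(X, K):
        # Local step on one node machine: top-K eigenvectors of the
        # local sample covariance; only this d-by-K matrix is transmitted.
        Sigma_hat = X.T @ X / X.shape[0]
        _, eigvecs = np.linalg.eigh(Sigma_hat)  # eigenvalues in ascending order
        return eigvecs[:, -K:]

    def distributed_pca(data_blocks, K):
        # Central-server step: average the projection matrices V V^T
        # from all nodes, then run PCA (an eigendecomposition) on the average.
        d = data_blocks[0].shape[1]
        Sigma_tilde = np.zeros((d, d))
        for X in data_blocks:
            V = local_top_k_eigvecs(X, K)
            Sigma_tilde += V @ V.T
        Sigma_tilde /= len(data_blocks)
        _, eigvecs = np.linalg.eigh(Sigma_tilde)
        return eigvecs[:, -K:]  # estimated basis of the top-K eigenspace

    # Illustrative usage on synthetic data split across 10 machines:
    rng = np.random.default_rng(0)
    blocks = [rng.standard_normal((500, 20)) for _ in range(10)]
    V_hat = distributed_pca(blocks, K=3)  # 20 x 3 orthonormal basis

Note that each node communicates only a d-by-K matrix rather than its raw data, which is the source of the communication efficiency discussed above.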
Pages: 3009-3031
Number of pages: 23