Communication-Efficient Distributed Eigenspace Estimation

Cited by: 5
Authors
Charisopoulos, Vasileios [1 ]
Benson, Austin R. [2 ]
Damle, Anil [2 ]
Affiliations
[1] Cornell Univ, Dept Operat Res & Informat Engn, Ithaca, NY 14853 USA
[2] Cornell Univ, Dept Comp Sci, Ithaca, NY 14853 USA
Source
SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE | 2021, Vol. 3, No. 4
Keywords
distributed computing; spectral methods; nonconvex optimization; principal component analysis; statistics;
DOI
10.1137/20M1364862
CLC classification
O29 [Applied Mathematics];
Discipline classification code
070104 ;
Abstract
Distributed computing is a standard way to scale up machine learning and data science algorithms to process large amounts of data. In such settings, avoiding communication amongst machines is paramount for achieving high performance. Rather than distribute the computation of existing algorithms, a common practice for avoiding communication is to compute local solutions or parameter estimates on each machine and then combine the results; in many convex optimization problems, even simple averaging of local solutions can work well. However, these schemes do not work when the local solutions are not unique. Spectral methods are a collection of such problems, where solutions are orthonormal bases of the leading invariant subspace of an associated data matrix. These solutions are only unique up to rotations and reflections. Here, we develop a communication-efficient distributed algorithm for computing the leading invariant subspace of a data matrix. Our algorithm uses a novel alignment scheme that minimizes the Procrustean distance between local solutions and a reference solution and only requires a single round of communication. For the important case of principal component analysis (PCA), we show that our algorithm achieves a similar error rate to that of a centralized estimator. We present numerical experiments demonstrating the efficacy of our proposed algorithm for distributed PCA as well as other problems where solutions exhibit rotational symmetry, such as node embeddings for graph data and spectral initialization for quadratic sensing.
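The abstract's core idea — align each machine's local eigenspace basis to a reference via an orthogonal Procrustes problem, then average — can be sketched as follows. This is an illustrative NumPy reconstruction, not the authors' implementation; the problem sizes, noise level, and synthetic data setup are invented for demonstration. The Procrustes solution uses the standard SVD-based closed form.

```python
import numpy as np

def procrustes_align(V_local, V_ref):
    """Solve min_Z ||V_local @ Z - V_ref||_F over orthogonal Z
    via the SVD of V_local.T @ V_ref (classical closed form)."""
    U, _, Wt = np.linalg.svd(V_local.T @ V_ref)
    return V_local @ (U @ Wt)

rng = np.random.default_rng(0)
d, k, m = 50, 3, 4  # dimension, subspace rank, number of machines (illustrative)

# Shared low-rank signal observed with independent per-machine noise.
A = rng.standard_normal((d, k))
local_bases = []
for _ in range(m):
    E = 0.1 * rng.standard_normal((d, d))
    M = A @ A.T + (E + E.T) / 2  # symmetric noisy local data matrix
    _, vecs = np.linalg.eigh(M)  # eigenvalues in ascending order
    local_bases.append(vecs[:, -k:])  # leading k-dimensional eigenspace basis

# Each local basis is only determined up to rotation/reflection, so naive
# averaging can cancel; align all bases to one reference first.
V_ref = local_bases[0]
aligned = [procrustes_align(V, V_ref) for V in local_bases]
avg = sum(aligned) / m
Q, _ = np.linalg.qr(avg)  # re-orthonormalize the averaged estimate
```

A single round of communication suffices here because each machine only needs the reference basis to compute its alignment, after which the aligned bases (or their running sum) are combined once.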
Pages: 1067-1092
Page count: 26