A communication-efficient and privacy-aware distributed algorithm for sparse PCA

Cited by: 0
Authors
Lei Wang
Xin Liu
Yin Zhang
Institutions
[1] Academy of Mathematics and Systems Science, State Key Laboratory of Scientific and Engineering Computing
[2] University of Chinese Academy of Sciences, School of Mathematical Sciences
[3] The Chinese University of Hong Kong, School of Data Science
Source
Computational Optimization and Applications | 2023, Vol. 85
Keywords
Alternating direction method of multipliers; Distributed computing; Optimization with orthogonality constraints; Sparse PCA;
DOI: not available
Abstract
Sparse principal component analysis (PCA) improves the interpretability of classic PCA by introducing sparsity into the dimension-reduction process. Optimization models for sparse PCA, however, are generally non-convex and non-smooth, and hence harder to solve, especially on large-scale datasets that require distributed computation over a wide network. In this paper, we develop a distributed algorithm with a centralized communication structure, called DSSAL1, for sparse PCA; it aims to achieve low communication overhead by adapting a newly proposed subspace-splitting strategy to accelerate convergence. Theoretically, convergence to stationary points is established for DSSAL1. Extensive numerical results show that DSSAL1 requires far fewer rounds of communication than state-of-the-art peer methods. In addition, we make the case that, since the messages exchanged in DSSAL1 are well masked, the possibility of private-data leakage is much lower in DSSAL1 than in some other distributed algorithms.
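To make the problem setting concrete, the sketch below (not the authors' DSSAL1 code) illustrates the standard ℓ1-regularized sparse PCA model that methods of this kind target: minimize -½ tr(XᵀAᵀAX) + μ‖X‖₁ subject to the orthogonality constraint XᵀX = I. The data matrix `A`, the penalty weight `mu`, and the dimensions are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, p = 100, 20, 3           # samples, features, number of components (assumed)
A = rng.standard_normal((n, d))  # synthetic data matrix (assumed)
mu = 0.1                         # l1 sparsity weight (assumed)

def objective(X, A, mu):
    """Negative explained variance plus l1 penalty:
    -0.5 * tr(X^T A^T A X) + mu * ||X||_1."""
    M = A.T @ A
    return -0.5 * np.trace(X.T @ M @ X) + mu * np.abs(X).sum()

def retract(Y):
    """Map an arbitrary d-by-p matrix onto the orthogonality constraint
    set {X : X^T X = I_p} via a QR-based retraction."""
    Q, R = np.linalg.qr(Y)
    return Q * np.sign(np.diag(R))  # fix column signs for uniqueness

X0 = retract(rng.standard_normal((d, p)))
print(np.allclose(X0.T @ X0, np.eye(p)))  # feasibility check -> True
```

The non-convexity mentioned in the abstract comes from the orthogonality constraint, and the non-smoothness from the ℓ1 term; DSSAL1 is reported to handle both in a distributed setting with low communication cost.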
Pages: 1033–1072 (39 pages)