Seeking Consensus on Subspaces in Federated Principal Component Analysis

Cited: 0
Authors
Wang, Lei [1 ]
Liu, Xin [2 ,3 ]
Zhang, Yin [4 ]
Affiliations
[1] Hong Kong Polytech Univ, Dept Appl Math, Hong Kong, Peoples R China
[2] Chinese Acad Sci, Acad Math & Syst Sci, Beijing, Peoples R China
[3] Univ Chinese Acad Sci, Beijing, Peoples R China
[4] Chinese Univ Hong Kong, Shenzhen, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Alternating direction method of multipliers; Federated learning; Principal component analysis; Orthogonality constraints; Simultaneous iteration; Optimization problems; Framework; Algorithm; SVD;
DOI
10.1007/s10957-024-02523-1
Chinese Library Classification
C93 [Management Science]; O22 [Operations Research];
Discipline Classification Codes
070105; 12; 1201; 1202; 120202;
Abstract
In this paper, we develop an algorithm for federated principal component analysis (PCA) with emphasis on both communication efficiency and data privacy. Generally speaking, federated PCA algorithms based on direct adaptations of classic iterative methods, such as simultaneous subspace iteration, cannot preserve data privacy, while algorithms based on variable splitting and consensus seeking, such as alternating direction methods of multipliers (ADMM), lack communication efficiency. In this work, we propose a novel consensus-seeking formulation that equalizes the subspaces spanned by the splitting variables rather than the variables themselves, thus greatly relaxing feasibility restrictions and allowing much faster convergence. We then develop an ADMM-like algorithm with several special features that make it practically efficient, including a low-rank multiplier formula and techniques for treating subproblems. We establish that the proposed algorithm protects data privacy better than classic methods adapted to the federated PCA setting, and we derive convergence results, including a worst-case complexity estimate, in the presence of the nonlinear equality constraints. Extensive empirical results show that the new algorithm, while enhancing data privacy, requires far fewer rounds of communication than existing peer algorithms for federated PCA.
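The central idea of the abstract, seeking consensus on the *subspaces* spanned by the splitting variables rather than on the variables themselves, can be illustrated with a small NumPy sketch. This is not the authors' algorithm, only a demonstration of why the relaxed constraint is easier to satisfy: two orthonormal bases that differ entrywise (so classic variable consensus fails) can still span exactly the same subspace, which is captured by comparing their projection matrices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 2  # ambient dimension and subspace dimension (illustrative values)

# An orthonormal basis X for a k-dimensional subspace of R^n.
X, _ = np.linalg.qr(rng.standard_normal((n, k)))

# Rotate the basis by an arbitrary k x k orthogonal matrix Q:
# Y = X @ Q spans the same subspace as X but differs entrywise.
Q, _ = np.linalg.qr(rng.standard_normal((k, k)))
Y = X @ Q

# Classic variable consensus would require X == Y ...
variable_gap = np.linalg.norm(X - Y)

# ... while subspace consensus only requires equal projection
# matrices X X^T == Y Y^T, i.e. equal column spans.
subspace_gap = np.linalg.norm(X @ X.T - Y @ Y.T)

print("entrywise gap:", variable_gap)
print("subspace gap: ", subspace_gap)
```

The entrywise gap is bounded away from zero while the subspace gap vanishes (up to roundoff), so the subspace-consensus constraint admits many more feasible points, which is the source of the relaxed feasibility restrictions mentioned in the abstract.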
Pages: 529-561
Page count: 33
Related Articles
50 records
  • [21] Parameterized principal component analysis
    Gupta, Ajay
    Barbu, Adrian
    PATTERN RECOGNITION, 2018, 78 : 215 - 227
  • [22] Ensemble Principal Component Analysis
    Dorabiala, Olga
    Aravkin, Aleksandr Y.
    Kutz, J. Nathan
    IEEE ACCESS, 2024, 12 : 6663 - 6671
  • [23] Regularized Principal Component Analysis
    Aflalo, Yonathan
    Kimmel, Ron
    CHINESE ANNALS OF MATHEMATICS SERIES B, 2017, 38 (01) : 1 - 12
  • [24] Bayesian principal component analysis
    Nounou, MN
    Bakshi, BR
    Goel, PK
    Shen, XT
    JOURNAL OF CHEMOMETRICS, 2002, 16 (11) : 576 - 595
  • [25] A Principal Component Analysis for Trees
    Aydin, Burcu
    Pataki, Gabor
    Wang, Haonan
    Bullitt, Elizabeth
    Marron, J. S.
    ANNALS OF APPLIED STATISTICS, 2009, 3 (04) : 1597 - 1615
  • [26] Principal component spectral analysis
    Guo, Hao
    Marfurt, Kurt J.
    Liu, Jianlei
    GEOPHYSICS, 2009, 74 (04) : P35 - P43
  • [27] Adaptive Principal Component Analysis
    Li, Xiangyu
    Wang, Hua
    PROCEEDINGS OF THE 2022 SIAM INTERNATIONAL CONFERENCE ON DATA MINING, SDM, 2022, : 486 - 494
  • [28] A robust principal component analysis
    Ibazizen, M
    Dauxois, J
    STATISTICS, 2003, 37 (01) : 73 - 83
  • [29] On coMADs and Principal Component Analysis
    Kazempour, Daniyal
    Huenemoerder, Max
    Seidl, Thomas
    SIMILARITY SEARCH AND APPLICATIONS (SISAP 2019), 2019, 11807 : 273 - 280