A Comprehensively Tight Analysis of Gradient Descent for PCA

Cited by: 0
Authors
Xu, Zhiqiang [1 ]
Li, Ping
Affiliations
[1] Baidu Research, Cognitive Computing Lab, 10 Xibeiwang East Rd, Beijing 100193, China
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021) | 2021, Vol. 34
Keywords
OPTIMIZATION;
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We study the Riemannian gradient method for PCA, for which a crucial fact is that, despite the simplicity of the considered setting, i.e., the deterministic version of Krasulina's method, the convergence rate has not yet been well understood. In this work, we provide a general tight analysis of the gap-dependent rate at O((1/Δ) log(1/ε)) that holds for any real symmetric matrix. More importantly, when the eigengap Δ is significantly smaller than the target accuracy ε on the objective suboptimality of the final solution, a rate of this type is no longer tight, which calls for a worst-case rate. We further give the first worst-case analysis, achieving a convergence rate of O((1/ε) log(1/ε)). The two analyses naturally yield a comprehensively tight convergence rate of O((1/max{Δ, ε}) log(1/ε)). In particular, our gap-dependent analysis suggests a new promising learning rate for stochastic variance-reduced PCA algorithms. Experiments are conducted to confirm our findings as well.
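The update analyzed in the abstract is simple to state concretely. Below is a minimal NumPy sketch of deterministic Krasulina-style Riemannian gradient ascent for the top eigenvector of a symmetric matrix; the function name, step size, tolerance, and stopping rule are illustrative assumptions, not the paper's tuned choices.

```python
import numpy as np

def riemannian_gd_pca(A, eta=0.1, tol=1e-8, max_iter=10_000, seed=0):
    """Top-eigenvector PCA via Riemannian gradient ascent on the unit
    sphere, i.e., the deterministic form of Krasulina's update.
    (A minimal sketch; eta, tol, and max_iter are illustrative.)"""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(A.shape[0])
    w /= np.linalg.norm(w)                 # random start on the unit sphere
    for _ in range(max_iter):
        Aw = A @ w
        rayleigh = w @ Aw                  # current objective value w^T A w
        grad = Aw - rayleigh * w           # Riemannian gradient (I - w w^T) A w
        if np.linalg.norm(grad) < tol:     # small gradient => near an eigenvector
            break
        w += eta * grad                    # gradient step in the tangent space
        w /= np.linalg.norm(w)             # retract back onto the sphere
    return w, w @ A @ w

# Toy check: the attained Rayleigh quotient should approach the top eigenvalue.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    B = rng.standard_normal((5, 5))
    A = (B + B.T) / 2                      # any real symmetric matrix works
    w, val = riemannian_gd_pca(A)
    print(val, np.linalg.eigvalsh(A)[-1])  # the two values should roughly agree
```

A random initialization almost surely has a nonzero component along the leading eigenvector, so the iterates escape the saddle points at the other eigenvectors; how the iteration count scales with the eigengap Δ versus the accuracy ε is exactly what the paper's O((1/max{Δ, ε}) log(1/ε)) bound characterizes.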
Pages: 12