A Comprehensively Tight Analysis of Gradient Descent for PCA

Cited by: 0
Authors
Xu, Zhiqiang [1]
Li, Ping
Affiliations
[1] Baidu Research, Cognitive Computing Lab, 10 Xibeiwang East Rd, Beijing 100193, Peoples R China
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021) | 2021 / Vol. 34
Keywords
OPTIMIZATION;
DOI
Not available
CLC number
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
We study the Riemannian gradient method for PCA, for which a crucial fact is that, despite the simplicity of the considered setting, i.e., the deterministic version of Krasulina's method, the convergence rate has not been well understood yet. In this work, we provide a general tight analysis of the gap-dependent rate at O(1/Δ log 1/ε) that holds for any real symmetric matrix. More importantly, when the eigengap Δ is significantly smaller than the target accuracy ε on the objective suboptimality of the final solution, a rate of this type is no longer tight, which calls for a worst-case rate. We further give the first worst-case analysis that achieves a convergence rate of O(1/ε log 1/ε). The two analyses naturally roll out a comprehensively tight convergence rate at O(1/max{Δ, ε} log 1/ε). In particular, our gap-dependent analysis suggests a new promising learning rate for stochastic variance-reduced PCA algorithms. Experiments are conducted to confirm our findings as well.
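For concreteness, below is a minimal sketch of the kind of iteration the abstract refers to: Riemannian gradient ascent on the unit sphere for the leading eigenvector of a real symmetric matrix A, i.e., the deterministic, single-vector setting in the spirit of Krasulina's method. The step size eta, the tolerance, and the renormalization retraction are illustrative assumptions, not the paper's prescribed choices.

```python
import numpy as np

def riemannian_pca(A, eta=None, tol=1e-8, max_iter=10_000, seed=0):
    """Riemannian gradient ascent on the unit sphere for the top eigenvector of A (sketch)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    w = rng.standard_normal(n)
    w /= np.linalg.norm(w)                        # start on the unit sphere
    if eta is None:
        # Conservative fixed step size tied to the spectral norm (an assumption,
        # not the learning rate analyzed in the paper).
        eta = 1.0 / (2.0 * np.linalg.norm(A, 2))
    for _ in range(max_iter):
        Aw = A @ w
        # Riemannian gradient of the Rayleigh quotient w -> w^T A w:
        # project the Euclidean gradient A w onto the tangent space at w.
        grad = Aw - (w @ Aw) * w
        if np.linalg.norm(grad) < tol:
            break
        w = w + eta * grad                        # gradient step
        w /= np.linalg.norm(w)                    # retract back onto the sphere
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    M = rng.standard_normal((50, 50))
    A = (M + M.T) / 2                             # any real symmetric matrix
    w = riemannian_pca(A)
    print("Rayleigh quotient :", w @ A @ w)
    print("Largest eigenvalue:", np.linalg.eigvalsh(A)[-1])
```

The Rayleigh quotient of the returned vector should approach the largest eigenvalue; the gap Δ between the top two eigenvalues governs how quickly, matching the gap-dependent regime discussed above.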
Pages: 12