Efficient Low-Rank Stochastic Gradient Descent Methods for Solving Semidefinite Programs

Cited by: 0
Authors
Chen, Jianhui [1 ]
Yang, Tianbao [2 ]
Zhu, Shenghuo [2 ]
Affiliations
[1] GE Global Res, San Ramon, CA 94583 USA
[2] NEC Labs Amer, Cupertino, CA 95014 USA
Source
ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 33 | 2014 / Vol. 33
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We propose a low-rank stochastic gradient descent (LR-SGD) method for solving a class of semidefinite programming (SDP) problems. LR-SGD has clear computational advantages over standard SGD methods because its iterative projection step (itself an SDP problem) can be solved efficiently. Specifically, LR-SGD constructs a low-rank stochastic gradient and computes an optimal solution to the projection step by exploiting the low-rank structure of that gradient. Moreover, our theoretical analysis shows that stochastic gradients of arbitrarily low rank always exist, which in turn validates the rationale of the LR-SGD method. Since LR-SGD is an SGD-based method, it achieves the optimal convergence rates of standard SGD methods. The presented experimental results demonstrate the efficiency and effectiveness of the LR-SGD method.
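To make the procedure described in the abstract concrete, below is a minimal sketch of projected SGD over the PSD cone with a rank-one stochastic gradient. The oracle grad_factor, the step size eta, and the feasible set are illustrative assumptions, and the eigendecomposition-based projection is the generic routine; the paper's contribution is an optimized projection step that exploits the low-rank update, which is not reproduced here.

import numpy as np

def project_psd(X):
    # Generic projection onto the PSD cone: symmetrize, then clip
    # negative eigenvalues. This costs O(n^3); LR-SGD's point is to
    # replace it with a cheaper routine that exploits the low-rank
    # structure of the update (not shown in this sketch).
    X = (X + X.T) / 2.0
    w, V = np.linalg.eigh(X)
    return (V * np.maximum(w, 0.0)) @ V.T

def lr_sgd_sketch(grad_factor, X0, steps=100, eta=0.1):
    # Projected SGD with rank-one stochastic gradients g = u u^T.
    # grad_factor is a hypothetical oracle: given the iterate X, it
    # returns a vector u whose outer product is an unbiased stochastic
    # gradient of the objective at X.
    X = X0.copy()
    for _ in range(steps):
        u = grad_factor(X)
        X = project_psd(X - eta * np.outer(u, u))
    return X

Because the update X - eta * u u^T perturbs the iterate by a rank-one term, the projection after each step admits structure that a full eigendecomposition ignores; the sketch above deliberately uses the naive projection to mark where the paper's low-rank analysis applies.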
Pages: 122-130
Number of pages: 9