Efficient Low-Rank Stochastic Gradient Descent Methods for Solving Semidefinite Programs

Cited by: 0
Authors
Chen, Jianhui [1 ]
Yang, Tianbao [2 ]
Zhu, Shenghuo [2 ]
Affiliations
[1] GE Global Res, San Ramon, CA 94583 USA
[2] NEC Labs Amer, Cupertino, CA 95014 USA
Source
ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 33 | 2014 / Vol. 33
Keywords
(none listed)
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We propose a low-rank stochastic gradient descent (LR-SGD) method for solving a class of semidefinite programming (SDP) problems. LR-SGD has clear computational advantages over standard SGD methods because its iterative projection step (itself an SDP) can be solved efficiently. Specifically, LR-SGD constructs a low-rank stochastic gradient and computes an optimal solution to the projection step by analyzing the low-rank structure of that gradient. Moreover, our theoretical analysis shows the universal existence of arbitrary low-rank stochastic gradients, which in turn validates the rationale of the LR-SGD method. Since LR-SGD is an SGD-based method, it achieves the optimal convergence rates of standard SGD methods. The presented experimental results demonstrate the efficiency and effectiveness of LR-SGD.
Pages: 122-130
Page count: 9
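
Illustrative Sketch
As a concrete illustration of the scheme the abstract describes, the Python snippet below runs projected SGD over the PSD cone driven by a rank-one stochastic gradient. This is a minimal sketch under our own assumptions, not the authors' construction: the rank-one estimator (G u) u^T with Gaussian u is one standard unbiased low-rank estimate, and the projection here uses a plain full eigendecomposition, whereas LR-SGD's contribution is precisely to solve that projection step cheaply by exploiting the low-rank structure of the update. The function names (psd_projection, rank_one_gradient, lr_sgd) and the toy objective are hypothetical.

import numpy as np

def psd_projection(X):
    """Euclidean projection onto the PSD cone via a full eigendecomposition.
    (LR-SGD avoids this O(d^3) step by exploiting the low-rank update.)"""
    w, V = np.linalg.eigh(X)
    return (V * np.maximum(w, 0.0)) @ V.T

def rank_one_gradient(G, rng):
    """Unbiased rank-one estimate of a symmetric gradient matrix G:
    with u ~ N(0, I) we have E[(G u) u^T] = G."""
    u = rng.standard_normal(G.shape[0])
    g = np.outer(G @ u, u)
    return 0.5 * (g + g.T)  # symmetrize; unbiasedness is preserved

def lr_sgd(grad, d, steps=1000, eta=0.05, seed=0):
    """Projected SGD over {X : X >= 0} fed with rank-one gradient estimates,
    using the standard O(1/sqrt(t)) step-size decay."""
    rng = np.random.default_rng(seed)
    X = np.eye(d) / d
    for t in range(steps):
        g = rank_one_gradient(grad(X), rng)
        X = psd_projection(X - eta / np.sqrt(t + 1) * g)  # projection step
    return X

# Toy objective f(X) = 0.5 * ||X - A||_F^2, whose gradient is X - A;
# its minimizer over the PSD cone is the PSD part of A.
d = 20
rng = np.random.default_rng(1)
B = rng.standard_normal((d, d))
A = 0.5 * (B + B.T)
X_hat = lr_sgd(lambda X: X - A, d)
print("distance to PSD part of A:", np.linalg.norm(X_hat - psd_projection(A)))

In the paper's setting the payoff comes from replacing psd_projection with a routine that updates the eigendecomposition of X under a rank-one perturbation, so each iteration avoids a full d-by-d factorization.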