A Non-Euclidean Gradient Descent Framework for Non-Convex Matrix Factorization

Cited by: 8
Authors
Hsieh, Ya-Ping [1 ]
Kao, Yu-Chun [2 ]
Mahabadi, Rabeeh Karimi [1 ]
Yurtsever, Alp [3 ]
Kyrillidis, Anastasios [4 ]
Cevher, Volkan [5 ]
Affiliations
[1] Ecole Polytech Fed Lausanne, Lab Informat & Inference Syst, CH-1015 Lausanne, Switzerland
[2] Univ Michigan, Ann Arbor, MI 48109 USA
[3] Ecole Polytech Fed Lausanne, Sch Comp & Commun Sci, CH-1015 Lausanne, Switzerland
[4] Rice Univ, Dept Comp Sci, Houston, TX 77005 USA
[5] Ecole Polytech Fed Lausanne, Dept Elect Engn, CH-1015 Lausanne, Switzerland
Funding
Swiss National Science Foundation;
Keywords
Non-convex optimization; low-rank approximation; non-Euclidean gradient descent; phase retrieval; semidefinite; optimization; minimization;
DOI
10.1109/TSP.2018.2870353
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject Classification Codes
0808; 0809;
Abstract
We study convex optimization problems that feature low-rank matrix solutions. In such scenarios, non-convex methods offer significant advantages over convex methods due to their lower space complexity, as well as faster convergence in practice. Under mild assumptions, these methods enjoy global convergence guarantees. In this paper, we extend the results on this matter by following a different path. We derive a non-Euclidean optimization framework in the non-convex setting that takes nonlinear gradient steps on the factors. Our framework makes it possible to further exploit the underlying problem structures, such as sparsity or low-rankness in the factorized domain, or a better dimensional dependence of the smoothness parameters of the objectives. We prove that the non-Euclidean methods enjoy the same rigorous guarantees as their Euclidean counterparts under appropriate assumptions. Numerical evidence on Fourier ptychography and FastText applications with real data shows that our approach can enhance solution quality, as well as convergence speed, over the standard non-convex approaches.
Pages: 5917-5926
Page count: 10
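The abstract describes gradient methods that operate directly on the factors of a low-rank matrix variable and allow non-Euclidean (nonlinear) gradient steps on those factors. The following is a minimal, hypothetical Python sketch of that general idea, not the authors' algorithm: it compares a plain Euclidean factored gradient step with a simple sign-based step (an l-infinity-geometry steepest-descent heuristic) on a synthetic rank-r recovery problem. The objective, step sizes, and data below are assumptions made purely for illustration.

```python
# Illustrative sketch (not the paper's exact method): factored gradient descent
# for min_U ||U U^T - M||_F^2, comparing a Euclidean step with a simple
# non-Euclidean (sign-based, l_inf-geometry) step on the factor U.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic rank-r target M = U* U*^T (hypothetical test data).
n, r = 50, 3
U_star = rng.standard_normal((n, r))
M = U_star @ U_star.T

def grad(U):
    # Gradient of f(U) = ||U U^T - M||_F^2 with respect to U.
    return 4.0 * (U @ U.T - M) @ U

def factored_gd(U0, step, num_iters=2000, non_euclidean=False):
    U = U0.copy()
    for _ in range(num_iters):
        G = grad(U)
        if non_euclidean:
            # Steepest-descent direction under an l_inf-type geometry:
            # every entry moves by the same magnitude, signed by the gradient.
            # Scaling by the mean absolute gradient keeps the step comparable
            # to the Euclidean update (a heuristic choice for this sketch).
            U -= step * np.mean(np.abs(G)) * np.sign(G)
        else:
            U -= step * G  # standard Euclidean gradient step on the factor
    return U

U0 = rng.standard_normal((n, r)) * 0.1
U_euc = factored_gd(U0, step=1e-3)
U_noneuc = factored_gd(U0, step=1e-3, non_euclidean=True)
print("Euclidean residual:    ", np.linalg.norm(U_euc @ U_euc.T - M))
print("Non-Euclidean residual:", np.linalg.norm(U_noneuc @ U_noneuc.T - M))
```

The point of the sketch is only that the update on the factor U need not follow the raw Euclidean gradient; any well-chosen non-Euclidean geometry yields its own steepest-descent step, which is the degree of freedom the paper exploits.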