lp Regularized low-rank approximation via iterative reweighted singular value minimization

Times cited: 0
Authors
Lu, Zhaosong [1]
Zhang, Yong [2]
Lu, Jian [3]
Affiliations
[1] Simon Fraser Univ, Dept Math, Burnaby, BC, Canada
[2] Colin Artificial Intelligence Lab Ltd, Richmond, BC, Canada
[3] Shenzhen Univ, Coll Math & Computat Sci, Shenzhen, Peoples R China
Funding
Natural Sciences and Engineering Research Council of Canada; National Natural Science Foundation of China;
Keywords
Low-rank approximation; Schatten-p quasi-norm regularized matrix minimization; Iterative reweighted singular value minimization; Iterative reweighted least squares; Least squares; Variable selection; Nonsmooth analysis; Recovery; Relaxation; Algorithm; Signals;
DOI
10.1007/s10589-017-9933-6
Chinese Library Classification
C93 [Management]; O22 [Operations Research];
Discipline codes
070105; 12; 1201; 1202; 120202;
Abstract
In this paper we study the lp (or Schatten-p quasi-norm) regularized low-rank approximation problems. In particular, we introduce a class of first-order stationary points for them and show that any local minimizer of these problems must be a first-order stationary point. In addition, we derive lower bounds for the nonzero singular values of the first-order stationary points, and hence also of the local minimizers of these problems. The iterative reweighted singular value minimization (IRSVM) methods are then proposed to solve these problems, whose subproblems are shown to have a closed-form solution. Compared to the analogous methods for the lp regularized vector minimization problems, the convergence analysis of these methods is significantly more challenging. We develop a novel approach to establishing the convergence of the IRSVM methods, which makes use of the expression of a specific solution of their subproblems and avoids the intricate issue of finding the explicit expression for the Clarke subdifferential of the objective of their subproblems. In particular, we show that any accumulation point of the sequence generated by the IRSVM methods is a first-order stationary point of the problems. Our computational results demonstrate that the IRSVM methods generally outperform the recently developed iterative reweighted least squares methods in terms of solution quality and/or speed.
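The reweighting idea summarized in the abstract can be illustrated with a simplified sketch. For the prototype problem min_X 0.5*||X - A||_F^2 + lam * sum_i (sigma_i(X)^2 + eps)^(p/2), each iteration builds weights from the current singular values and solves a weighted singular value thresholding subproblem, which has a closed-form solution when the weights are nondecreasing (as they are here for p < 1). This is only an illustrative approximation of the reweighting scheme, not the authors' exact IRSVM algorithm; the function name, the smoothing parameter eps, and the stopping rule are assumptions.

```python
import numpy as np

def irsvm_approx(A, lam=0.1, p=0.5, eps=1e-8, max_iter=100, tol=1e-6):
    """Illustrative reweighted singular value scheme (not the paper's exact
    IRSVM) for  min_X 0.5*||X - A||_F^2 + lam * sum_i (s_i(X)^2 + eps)^(p/2).
    Each subproblem is a weighted singular value thresholding step."""
    # SVD of the data matrix is fixed across iterations for this prototype.
    Ua, sa, Vta = np.linalg.svd(A, full_matrices=False)
    X = A.copy()
    for _ in range(max_iter):
        # Weights from linearizing the smoothed Schatten-p term at X:
        # larger singular values get smaller weights (less shrinkage).
        s = np.linalg.svd(X, compute_uv=False)
        w = lam * p * (s**2 + eps) ** (p / 2 - 1)
        # Closed-form weighted thresholding of A's singular values
        # (valid because w is nondecreasing while sa is nonincreasing).
        s_new = np.maximum(sa - w, 0.0)
        X_new = (Ua * s_new) @ Vta
        if np.linalg.norm(X_new - X) <= tol * max(1.0, np.linalg.norm(X)):
            return X_new
        X = X_new
    return X
```

On a diagonal test matrix with two dominant and two tiny singular values, the small values fall below their (large) weights and are driven exactly to zero, producing a genuinely low-rank approximant — the behavior the lower bounds on nonzero singular values in the paper formalize.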
Pages: 619-642
Page count: 24
References
47 records
[1] Anonymous. In: 33rd International Conference on Acoustics, Speech and Signal Processing, 2008.
[2] Anonymous. Danish Power System, 2009.
[3] Anonymous. In: Proceedings of the 27th Annual Conference on Computer Graphics, 2000.
[4] Barzilai J., Borwein J.M. Two-point step size gradient methods. IMA Journal of Numerical Analysis, 1988, 8(1): 141-148.
[5] Bian W., Chen X. Worst-case complexity of smoothing quadratic regularization methods for non-Lipschitzian optimization. SIAM Journal on Optimization, 2013, 23(3): 1718-1741.
[6] Birgin E.G., Martínez J.M., Raydan M. Nonmonotone spectral projected gradient methods on convex sets. SIAM Journal on Optimization, 2000, 10(4): 1196-1211.
[7] Cai J.-F., Candès E.J., Shen Z. A singular value thresholding algorithm for matrix completion. SIAM Journal on Optimization, 2010, 20(4): 1956-1982.
[8] Candès E.J., Recht B. Exact matrix completion via convex optimization. Foundations of Computational Mathematics, 2009, 9(6): 717-772.
[9] Candès E.J., Wakin M.B., Boyd S.P. Enhancing sparsity by reweighted l1 minimization. Journal of Fourier Analysis and Applications, 2008, 14(5-6): 877-905.
[10] Chartrand R. Exact reconstruction of sparse signals via nonconvex minimization. IEEE Signal Processing Letters, 2007, 14(10): 707-710.