lp Regularized low-rank approximation via iterative reweighted singular value minimization

Cited by: 0
Authors
Lu, Zhaosong [1 ]
Zhang, Yong [2 ]
Lu, Jian [3 ]
Affiliations
[1] Simon Fraser Univ, Dept Math, Burnaby, BC, Canada
[2] Colin Artificial Intelligence Lab Ltd, Richmond, BC, Canada
[3] Shenzhen Univ, Coll Math & Computat Sci, Shenzhen, Peoples R China
Funding
Natural Sciences and Engineering Research Council of Canada; National Natural Science Foundation of China;
Keywords
Low-rank approximation; Schatten-p quasi-norm regularized matrix minimization; Iterative reweighted singular value minimization; Iterative reweighted least squares; LEAST-SQUARES; VARIABLE SELECTION; NONSMOOTH ANALYSIS; RECOVERY; RELAXATION; ALGORITHM; SIGNALS;
DOI
10.1007/s10589-017-9933-6
Chinese Library Classification
C93 [Management]; O22 [Operations Research];
Discipline codes
070105; 12; 1201; 1202; 120202;
Abstract
In this paper we study the l(p) (or Schatten-p quasi-norm) regularized low-rank approximation problems. In particular, we introduce a class of first-order stationary points for them and show that any local minimizer of these problems must be a first-order stationary point. In addition, we derive lower bounds for the nonzero singular values of the first-order stationary points, and hence also of the local minimizers, of these problems. We then propose iterative reweighted singular value minimization (IRSVM) methods to solve these problems and show that their subproblems have closed-form solutions. Compared to the analogous methods for the l(p) regularized vector minimization problems, the convergence analysis of these methods is significantly more challenging. We develop a novel approach to establishing the convergence of the IRSVM methods, which makes use of the expression of a specific solution of their subproblems and avoids the intricate issue of finding the explicit expression for the Clarke subdifferential of the objective of their subproblems. In particular, we show that any accumulation point of the sequence generated by the IRSVM methods is a first-order stationary point of the problems. Our computational results demonstrate that the IRSVM methods generally outperform the recently developed iterative reweighted least squares methods in terms of solution quality and/or speed.
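The abstract describes an iterative reweighted scheme whose subproblems admit closed-form solutions via weighted singular value thresholding. The sketch below is a hypothetical illustration of this general idea for the problem min_X 0.5*||X - A||_F^2 + lam * sum_i sigma_i(X)^p, not the authors' exact algorithm: the function name `irsvm_sketch`, the smoothing parameter `eps`, and the stopping rule are assumptions made for this example.

```python
import numpy as np

def irsvm_sketch(A, lam=0.5, p=0.5, eps=1e-3, max_iter=100, tol=1e-6):
    """Generic iterative reweighted singular value sketch (illustrative only)
    for min_X 0.5*||X - A||_F^2 + lam * sum_i sigma_i(X)^p with 0 < p < 1."""
    X = A.copy()
    for _ in range(max_iter):
        # Weights from the current iterate: since p - 1 < 0, small singular
        # values receive large weights, promoting low rank.
        _, s, _ = np.linalg.svd(X, full_matrices=False)
        w = p * (s + eps) ** (p - 1)
        # Subproblem: min_X 0.5*||X - A||_F^2 + lam * sum_i w_i * sigma_i(X).
        # With nondecreasing weights it has a closed-form solution obtained by
        # soft-thresholding the singular values of A.
        Ua, sa, Vat = np.linalg.svd(A, full_matrices=False)
        s_new = np.maximum(sa - lam * w, 0.0)
        X_new = (Ua * s_new) @ Vat
        if np.linalg.norm(X_new - X, "fro") <= tol * max(1.0, np.linalg.norm(X, "fro")):
            return X_new
        X = X_new
    return X
```

Because the weights decrease as a singular value grows, large singular values are shrunk only mildly while small ones are driven to exactly zero, which is how the scheme produces genuinely low-rank iterates.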
Pages: 619-642 (24 pages)