A SMOOTHING PROXIMAL GRADIENT ALGORITHM FOR NONSMOOTH CONVEX REGRESSION WITH CARDINALITY PENALTY

Cited by: 57
Authors
Bian, Wei [1 ,2 ]
Chen, Xiaojun [3 ]
Affiliations
[1] Harbin Inst Technol, Sch Math, Harbin, Peoples R China
[2] Harbin Inst Technol, Inst Adv Study Math, Harbin 150001, Peoples R China
[3] Hong Kong Polytech Univ, Dept Appl Math, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
nonsmooth convex regression; cardinality penalty; proximal gradient method; smoothing method; global sequence convergence; OPTIMIZATION PROBLEMS; VARIABLE SELECTION; LEAST-SQUARES; OPTIMALITY; SPARSITY; MINIMIZATION; DIFFERENCE; SIGNAL; VIEW;
DOI
10.1137/18M1186009
Chinese Library Classification
O29 (Applied Mathematics)
Discipline code
070104
Abstract
In this paper, we focus on the constrained sparse regression problem, where the loss function is convex but nonsmooth and the penalty term is defined by the cardinality function. First, we give an exact continuous relaxation problem, in the sense that both problems have the same optimal solution set. Moreover, we show that a vector is a local minimizer with the lower bound property of the original problem if and only if it is a lifted stationary point of the relaxation problem. Second, we propose a smoothing proximal gradient (SPG) algorithm for finding a lifted stationary point of the continuous relaxation model. Our algorithm is a novel combination of the classical proximal gradient algorithm and the smoothing method. We prove that the proposed SPG algorithm globally converges to a lifted stationary point of the relaxation problem, attains a local convergence rate of o(k^(-τ)) with τ ∈ (0, 1/2) on the objective function value, and identifies the zero entries of the lifted stationary point in finitely many iterations. Finally, we use three examples to illustrate the validity of the continuous relaxation model and the good numerical performance of the SPG algorithm.
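The abstract combines a proximal gradient step with a smoothing of the nonsmooth convex loss. The sketch below illustrates that general scheme under stated assumptions, not the paper's exact method: it uses an ℓ1 loss ||Ax - b||_1 smoothed Huber-style with parameter μ, a capped-ℓ1 penalty λ·min(|x_i|/ν, 1) as a stand-in continuous relaxation of the cardinality penalty (the paper's exact relaxation may differ), and an assumed μ_k ∝ 1/√(k+1) schedule with step size μ/||A||² from the 1/μ Lipschitz bound of the smoothed gradient.

```python
import numpy as np

def huber_grad(r, mu):
    # Derivative of the Huber-type smoothing of |t|:
    # t^2/(2*mu) for |t| <= mu, |t| - mu/2 otherwise.
    return np.clip(r / mu, -1.0, 1.0)

def prox_capped_l1(z, alpha, lam, nu):
    # Exact proximal map of x -> lam * min(|x|/nu, 1), computed
    # coordinate-wise by comparing the closed-form stationary candidates:
    # the soft-thresholded point, the kink at sign(z)*nu, and z itself.
    soft = np.sign(z) * np.maximum(np.abs(z) - alpha * lam / nu, 0.0)
    cands = np.stack([soft, np.sign(z) * nu, z])
    vals = (0.5 * (cands - z) ** 2
            + alpha * lam * np.minimum(np.abs(cands) / nu, 1.0))
    return cands[np.argmin(vals, axis=0), np.arange(z.size)]

def spg(A, b, lam=0.1, nu=0.5, mu0=1.0, iters=500):
    # Smoothing proximal gradient loop (illustrative assumptions only:
    # schedule, step size, and relaxation are not taken from the paper).
    x = np.zeros(A.shape[1])
    L_A = np.linalg.norm(A, 2) ** 2       # spectral norm squared of A
    for k in range(iters):
        mu = mu0 / np.sqrt(k + 1)         # assumed smoothing schedule
        g = A.T @ huber_grad(A @ x - b, mu)  # gradient of smoothed l1 loss
        alpha = mu / L_A                  # step from the 1/mu Lipschitz bound
        x = prox_capped_l1(x - alpha * g, alpha, lam, nu)
    return x
```

On a small noiseless problem such a loop drives the residual down while the penalty's thresholding keeps most coordinates exactly zero, mirroring the finite identification of zero entries that the abstract proves for the SPG algorithm.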
Pages: 858-883 (26 pages)