A Proximal-Proximal Majorization-Minimization Algorithm for Nonconvex Rank Regression Problems

Cited by: 3
Authors
Tang, Peipei [1 ]
Wang, Chengjing [2 ]
Jiang, Bo [3 ]
Affiliations
[1] Hangzhou City Univ, Sch Comp & Comp Sci, Hangzhou 310015, Peoples R China
[2] Southwest Jiaotong Univ, Sch Math, Chengdu 611756, Peoples R China
[3] Zhejiang Univ, Sch Comp Sci, Hangzhou 310027, Zhejiang, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Nonconvex rank regression problems; semismooth Newton method; proximal-proximal majorization-minimization algorithm; proximal point algorithm; NONCONCAVE PENALIZED LIKELIHOOD; AUGMENTED LAGRANGIAN METHOD; SQUARE-ROOT LASSO; VARIABLE SELECTION; SPARSE RECOVERY; ROBUST; MATRIX; OPTIMALITY; SHRINKAGE; SIGNALS;
DOI
10.1109/TSP.2023.3315454
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
In this paper, we introduce a proximal-proximal majorization-minimization (PPMM) algorithm for nonconvex rank regression problems. The basic idea is to apply a proximal majorization-minimization algorithm to the nonconvex problem, with the inner subproblems solved by a proximal point algorithm (PPA) based on a sparse semismooth Newton (SSN) method. Notably, we adopt a sequential regularization technique and design an implementable stopping criterion to overcome the singularity of the inner subproblems; this stopping criterion is crucial to the success of the algorithm. We prove that the PPMM algorithm converges to a stationary point and, by exploiting the Kurdyka-Łojasiewicz (KL) property of the problem, establish its convergence rate. Numerical experiments demonstrate that the proposed algorithm outperforms existing state-of-the-art algorithms.
Pages: 3502-3517
Page count: 16