Newton-Type Greedy Selection Methods for l0-Constrained Minimization

Cited by: 15
Authors
Yuan, Xiao-Tong [1]
Liu, Qingshan [1]
Affiliations
[1] Nanjing Univ Informat Sci & Technol, Jiangsu Prov Key Lab Big Data Anal Technol, Nanjing 210044, Jiangsu, Peoples R China
Keywords
Sparsity; Newton methods; greedy selection; optimization; M-estimation; signal recovery; representations; approximation; pursuit
DOI
10.1109/TPAMI.2017.2651813
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We introduce a family of Newton-type greedy selection methods for l0-constrained minimization problems. The basic idea is to construct a quadratic function that approximates the original objective around the current iterate and to solve the resulting quadratic program over the cardinality constraint. The next iterate is then estimated via a line search between the current iterate and the solution of the sparse quadratic program. This iterative procedure can be interpreted as an extension of constrained Newton methods from convex minimization to non-convex l0-constrained minimization. We show that the proposed algorithms converge asymptotically and that the rate of local convergence is superlinear up to a certain estimation error. Our methods compare favorably against several state-of-the-art greedy selection methods when applied to sparse logistic regression and sparse support vector machines.
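For intuition, the following Python sketch outlines one outer iteration of the scheme described in the abstract: build a quadratic model of the objective at the current iterate, approximately solve the cardinality-constrained quadratic program, and run a line search along the segment toward that sparse solution. It is a minimal illustration, not the authors' implementation: the inner sparse quadratic program is approximated here by iterative hard thresholding on the model, the final re-projection to a k-sparse vector is a simplification, and the helper names (objective, grad, hess, hard_threshold) are assumptions introduced for this example.

import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x and zero out the rest."""
    z = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    z[idx] = x[idx]
    return z

def newton_greedy_step(x, k, objective, grad, hess, inner_iters=20, ls_grid=10):
    """One outer iteration: quadratic model -> approximate sparse QP -> line search."""
    g, H = grad(x), hess(x)

    # Approximately minimize the quadratic model
    #   Q(d) = g^T (d - x) + 0.5 (d - x)^T H (d - x)  s.t.  ||d||_0 <= k
    # by projected gradient (hard thresholding) steps on the model.
    eta = 1.0 / (np.linalg.norm(H, 2) + 1e-12)   # step size from the model's curvature
    d = hard_threshold(x, k)
    for _ in range(inner_iters):
        model_grad = g + H @ (d - x)
        d = hard_threshold(d - eta * model_grad, k)

    # Grid line search on the original objective along the segment between x and d.
    best_t, best_val = 0.0, objective(x)
    for t in np.linspace(0.0, 1.0, ls_grid + 1)[1:]:
        val = objective(x + t * (d - x))
        if val < best_val:
            best_t, best_val = t, val

    # Re-project so the next iterate stays k-sparse (a simplification of the paper's rule).
    return hard_threshold(x + best_t * (d - x), k)

The step size taken from the model's spectral norm is a standard choice for hard-thresholding iterations on a quadratic; the paper's greedy inner solver and its line-search rule differ in detail, so this sketch should be read only as a schematic of the quadratic-approximation-plus-line-search structure.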
Pages: 2437-2450
Page count: 14