A HIGHLY EFFICIENT SEMISMOOTH NEWTON AUGMENTED LAGRANGIAN METHOD FOR SOLVING LASSO PROBLEMS

Cited by: 112
Authors
Li, Xudong [1 ]
Sun, Defeng [2 ,3 ]
Toh, Kim-Chuan [4 ]
Affiliations
[1] Princeton Univ, Dept Operat Res & Financial Engn, Princeton, NJ 08544 USA
[2] Hong Kong Polytech Univ, Dept Appl Math, Hong Kong, Hong Kong, Peoples R China
[3] Natl Univ Singapore, Dept Math, 10 Lower Kent Ridge Rd, Singapore 119076, Singapore
[4] Natl Univ Singapore, Inst Operat Res & Analyt, 10 Lower Kent Ridge Rd, Singapore 119076, Singapore
Keywords
lasso; sparse optimization; augmented Lagrangian; metric subregularity; semismoothness; Newton's method; error bounds; optimization; shrinkage; algorithm; convergence; continuity; stability; gradient; smooth
DOI
10.1137/16M1097572
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
We develop a fast and robust algorithm for solving large-scale convex composite optimization models, with an emphasis on ℓ1-regularized least squares regression (lasso) problems. Although a large number of solvers for lasso problems exist in the literature, we found that none of them can efficiently handle difficult large-scale regression problems with real data. By leveraging available error bound results to realize the asymptotic superlinear convergence of the augmented Lagrangian algorithm, and by exploiting the second order sparsity of the problem through the semismooth Newton method, we propose an algorithm, called SSNAL, that efficiently solves these difficult problems. Under very mild conditions, which hold automatically for lasso problems, both the primal and the dual iteration sequences generated by SSNAL possess a fast linear convergence rate, which can even be asymptotically superlinear. Numerical comparisons between our approach and a number of state-of-the-art solvers on real data sets demonstrate the high efficiency and robustness of the proposed algorithm in solving difficult large-scale lasso problems.
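The model solved by SSNAL is the standard lasso: minimize 0.5‖Ax − b‖² + λ‖x‖₁. As a concrete reference point only (not the SSNAL algorithm itself, which combines an augmented Lagrangian outer loop with a semismooth Newton inner solver), the following is a minimal proximal-gradient (ISTA) sketch in Python, assuming a dense NumPy design matrix `A`; the soft-thresholding operator it uses is the proximal map of the ℓ1 norm, a building block shared by most lasso solvers:

```python
import numpy as np

def lasso_objective(A, b, x, lam):
    # The lasso model from the abstract: 0.5*||Ax - b||^2 + lam*||x||_1
    return 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))

def soft_threshold(v, t):
    # Proximal map of t*||.||_1: shrink each entry toward zero by t
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, steps=1000):
    # Baseline proximal-gradient (ISTA) iteration; first-order methods like
    # this are among the solvers SSNAL is compared against.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)           # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

On a small synthetic problem with a sparse ground truth, the iteration drives the objective well below that of the zero vector and recovers the support; the second order (Hessian) sparsity exploited by SSNAL refers to the few active coordinates that survive this thresholding.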
Pages: 433-458 (26 pages)
References (57 total)
  • [1] [Anonymous]. Finite-Dimensional Variational Inequalities and Complementarity Problems. 2007
  • [2] [Anonymous]. Preprint
  • [3] Beck, Amir; Teboulle, Marc. A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems. SIAM Journal on Imaging Sciences, 2009, 2(1): 183-202
  • [4] Becker, Stephen; Bobin, Jerome; Candes, Emmanuel J. NESTA: A Fast and Accurate First-Order Method for Sparse Recovery. SIAM Journal on Imaging Sciences, 2011, 4(1): 1-39
  • [5] Byrd, R. H. Mathematical Programming, 2016, 159: 435. DOI: 10.1007/s10107-015-0965-3
  • [6] Byrd, Richard H.; Nocedal, Jorge; Oztoprak, Figen. An Inexact Successive Quadratic Approximation Method for L-1 Regularized Optimization. Mathematical Programming, 2016, 157(2): 375-396
  • [7] Chang, Chih-Chung; Lin, Chih-Jen. LIBSVM: A Library for Support Vector Machines. ACM Transactions on Intelligent Systems and Technology, 2011, 2(3)
  • [8] Chen, S. S. B.; Donoho, D. L.; Saunders, M. A. Atomic Decomposition by Basis Pursuit. SIAM Journal on Scientific Computing, 1998, 20(1): 33-61
  • [9] Clarke, F. Optimization and Nonsmooth Analysis. 1983
  • [10] Cui, Y. Preprint, 2016