The Trimmed Lasso: Sparse Recovery Guarantees and Practical Optimization by the Generalized Soft-Min Penalty

Cited by: 9
Authors
Amir, Tal [1 ]
Basri, Ronen [1 ]
Nadler, Boaz [1 ]
Affiliations
[1] Weizmann Inst Sci, Dept Comp Sci & Appl Math, IL-7610001 Rehovot, Israel
Source
SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE | 2021, Vol. 3, No. 3
Keywords
best subset selection; sparse recovery; trimmed lasso; nonseparable penalty; NONCONCAVE PENALIZED LIKELIHOOD; ROBUST UNCERTAINTY PRINCIPLES; RESTRICTED ISOMETRY PROPERTY; REWEIGHTED LEAST-SQUARES; VARIABLE SELECTION; SIGNAL RECOVERY; LINEAR-MODELS; ALGORITHMS; MINIMIZATION; RECONSTRUCTION;
DOI
10.1137/20M1330634
CLC number
O29 [Applied Mathematics];
Subject classification code
070104;
Abstract
We present a new approach to solve the sparse approximation or best subset selection problem, namely to find a $k$-sparse vector $x \in \mathbb{R}^d$ that minimizes the $\ell_2$ residual $\|Ax - y\|_2$. We consider a regularized approach, whereby this residual is penalized by the nonconvex trimmed lasso, defined as the $\ell_1$-norm of $x$ excluding its $k$ largest-magnitude entries. We prove that the trimmed lasso has several appealing theoretical properties, and in particular derive sparse recovery guarantees assuming successful optimization of the penalized objective. Next, we show empirically that directly optimizing this objective can be quite challenging. Instead, we propose a surrogate for the trimmed lasso, called the generalized soft-min. This penalty smoothly interpolates between the classical lasso and the trimmed lasso, while taking into account all possible $k$-sparse patterns. The generalized soft-min penalty involves summation over $\binom{d}{k}$ terms, yet we derive a polynomial-time algorithm to compute it. This, in turn, yields a practical method for the original sparse approximation problem. Via simulations, we demonstrate its competitive performance compared to the current state of the art.
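The two penalties named in the abstract can be stated concretely. Below is a short Python sketch (the function names and the exact soft-min form are illustrative assumptions, not taken from the paper): trimmed_lasso computes the trimmed lasso penalty exactly as defined above, and generalized_soft_min_naive illustrates a soft-min over the $\ell_1$-norms of all $\binom{d}{k}$ coordinate-subset complements by brute-force enumeration, which is precisely the exponential cost that the paper's polynomial-time algorithm avoids.

import numpy as np
from itertools import combinations

def trimmed_lasso(x, k):
    # Trimmed lasso penalty: the l1-norm of x excluding its k
    # largest-magnitude entries, i.e. the sum of the d-k smallest |x_i|.
    mags = np.sort(np.abs(x))        # magnitudes in ascending order
    return mags[:len(x) - k].sum()   # drop the k largest, sum the rest

def generalized_soft_min_naive(x, k, gamma):
    # Naive soft-min over the l1-norms of x restricted to the complement of
    # every k-subset of coordinates (an assumed illustrative form, not
    # necessarily the paper's exact definition). It enumerates all C(d, k)
    # subsets, so it is exponential in k; the paper's contribution is a
    # polynomial-time algorithm that avoids this enumeration.
    d = len(x)
    absx = np.abs(x)
    total = absx.sum()
    vals = np.array([total - absx[list(keep)].sum()
                     for keep in combinations(range(d), k)])
    # Log-sum-exp soft-min: gamma -> infinity recovers the trimmed lasso
    # (the minimum over subsets); gamma -> 0 gives the average over subsets,
    # which is proportional to the classical lasso penalty ||x||_1.
    m = vals.min()
    return m - np.log(np.mean(np.exp(-gamma * (vals - m)))) / gamma

x = np.array([3.0, -0.5, 0.1, 2.0, -0.05])
print(trimmed_lasso(x, k=2))                           # 0.65
print(generalized_soft_min_naive(x, k=2, gamma=50.0))  # ~0.70, tends to 0.65 as gamma grows

In this toy example the two largest-magnitude entries (3.0 and 2.0) are excluded, leaving |−0.5| + |0.1| + |−0.05| = 0.65; the smoothed surrogate approaches that value as gamma increases and moves toward a lasso-like average over subsets as gamma decreases.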
Pages: 900-929
Number of pages: 30