The Trimmed Lasso: Sparse Recovery Guarantees and Practical Optimization by the Generalized Soft-Min Penalty

Cited by: 9
Authors
Amir, Tal [1 ]
Basri, Ronen [1 ]
Nadler, Boaz [1 ]
Affiliations
[1] Weizmann Inst Sci, Dept Comp Sci & Appl Math, IL-7610001 Rehovot, Israel
Source
SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE | 2021, Vol. 3, Issue 3
Keywords
best subset selection; sparse recovery; trimmed lasso; nonseparable penalty; NONCONCAVE PENALIZED LIKELIHOOD; ROBUST UNCERTAINTY PRINCIPLES; RESTRICTED ISOMETRY PROPERTY; REWEIGHTED LEAST-SQUARES; VARIABLE SELECTION; SIGNAL RECOVERY; LINEAR-MODELS; ALGORITHMS; MINIMIZATION; RECONSTRUCTION;
DOI
10.1137/20M1330634
Chinese Library Classification
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
We present a new approach to solve the sparse approximation or best subset selection problem, namely, to find a $k$-sparse vector $x \in \mathbb{R}^d$ that minimizes the $\ell_2$ residual $\|Ax - y\|_2$. We consider a regularized approach, whereby this residual is penalized by the nonconvex trimmed lasso, defined as the $\ell_1$-norm of $x$ excluding its $k$ largest-magnitude entries. We prove that the trimmed lasso has several appealing theoretical properties, and in particular derive sparse recovery guarantees assuming successful optimization of the penalized objective. Next, we show empirically that directly optimizing this objective can be quite challenging. Instead, we propose a surrogate for the trimmed lasso, called the generalized soft-min. This penalty smoothly interpolates between the classical lasso and the trimmed lasso, while taking into account all possible $k$-sparse patterns. The generalized soft-min penalty involves summation over $\binom{d}{k}$ terms, yet we derive a polynomial-time algorithm to compute it. This, in turn, yields a practical method for the original sparse approximation problem. Via simulations, we demonstrate its competitive performance compared to the current state of the art.
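
To make the penalties in the abstract concrete, here is a minimal NumPy sketch (not taken from the paper): trimmed_lasso computes the $\ell_1$-norm of $x$ excluding its $k$ largest-magnitude entries, and naive_soft_min evaluates one plausible log-mean-exp soft-min over all $\binom{d}{k}$ size-$k$ support patterns by brute force. The function names and the exact soft-min form are illustrative assumptions; in particular, the brute-force loop is exponential in $d$ and is not the polynomial-time algorithm derived in the paper.

import numpy as np
from itertools import combinations

def trimmed_lasso(x, k):
    # l1-norm of x excluding its k largest-magnitude entries:
    # the sum of the d - k smallest magnitudes.
    mags = np.sort(np.abs(x))            # magnitudes in ascending order
    return mags[:len(x) - k].sum()

def naive_soft_min(x, k, gamma):
    # Brute-force soft-min over all (d choose k) support patterns S:
    # a log-mean-exp of the values ||x_{S^c}||_1, which tends to their mean
    # (proportional to the lasso) as gamma -> 0 and to their minimum
    # (the trimmed lasso) as gamma -> infinity. Illustrative only.
    d = len(x)
    absx = np.abs(x)
    vals = np.array([absx.sum() - absx[list(S)].sum()
                     for S in combinations(range(d), k)])
    a = -gamma * vals
    m = a.max()                          # shift for numerical stability
    log_mean_exp = m + np.log(np.mean(np.exp(a - m)))
    return -log_mean_exp / gamma

x = np.array([3.0, -0.2, 0.05, 1.5, -0.01])
print(trimmed_lasso(x, k=2))             # 0.26 = |-0.2| + |0.05| + |-0.01|
print(naive_soft_min(x, k=2, gamma=500)) # approaches 0.26 as gamma grows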
Pages: 900-929
Number of pages: 30