A Laplacian approach to l1-norm minimization

Cited: 0
Authors
Bonifaci, Vincenzo [1]
Affiliations
[1] Univ Roma Tre, Dipartimento Matemat & Fis, Rome, Italy
Keywords
l1 regression; basis pursuit; iteratively reweighted least squares; multiplicative weights; Laplacian paradigm; convex optimization; REWEIGHTED LEAST-SQUARES; 1ST-ORDER METHODS; ALGORITHMS; L(1)-MINIMIZATION; DECOMPOSITION; OPTIMIZATION; CONVERGENCE; DESCENT; SYSTEMS; ROBUST
DOI
10.1007/s10589-021-00270-x
CLC classification
C93 [Management Science]; O22 [Operations Research]
Subject classification codes
070105; 12; 1201; 1202; 120202
Abstract
We propose a novel differentiable reformulation of the linearly constrained l1 minimization problem, also known as the basis pursuit problem. The reformulation is inspired by the Laplacian paradigm of network theory and leads to a new family of gradient-based methods for the solution of l1 minimization problems. We analyze the iteration complexity of a natural solution approach to the reformulation, based on a multiplicative weights update scheme, as well as the iteration complexity of an accelerated gradient scheme. The results can be seen as bounds on the complexity of iteratively reweighted least squares (IRLS) type methods for basis pursuit.
Pages: 441-469
Page count: 29