Iteratively Reweighted Least Squares for Basis Pursuit with Global Linear Convergence Rate

Cited by: 0
Authors
Kuemmerle, Christian [1 ]
Verdun, Claudio Mayrink [2 ,3 ]
Stoeger, Dominik [4 ]
Affiliations
[1] Johns Hopkins Univ, Dept Appl Math & Stat, Baltimore, MD 21218 USA
[2] Tech Univ Munich, Dept Math, Munich, Germany
[3] Tech Univ Munich, Dept Elect & Comp Engn, Munich, Germany
[4] KU Eichstatt Ingolstadt, Dept Math, Eichstatt, Germany
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021) | 2021 / Vol. 34
Keywords
SIGNAL RECOVERY; ALGORITHMS; ROBUST; MINIMIZATION; RECONSTRUCTION; SELECTION; PROPERTY;
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The recovery of sparse data is at the core of many applications in machine learning and signal processing. While such problems can be tackled using ℓ1-regularization, as in the LASSO estimator and in the Basis Pursuit approach, specialized algorithms are typically required to solve the corresponding high-dimensional, non-smooth optimization problems for large instances. Iteratively Reweighted Least Squares (IRLS) is a widely used algorithm for this purpose due to its excellent numerical performance. However, while existing theory guarantees convergence of this algorithm to the minimizer, it does not provide a global convergence rate. In this paper, we prove that a variant of IRLS converges with a global linear rate to a sparse solution, i.e., with a linear error decrease occurring immediately from any initialization, provided the measurements fulfill the usual null space property assumption. We support our theory by numerical experiments showing that our linear rate captures the correct dimension dependence. We anticipate that our theoretical findings will lead to new insights for many other use cases of the IRLS algorithm, such as in low-rank matrix recovery.
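For orientation, below is a minimal Python sketch of the classical smoothing-based IRLS iteration for basis pursuit (min ||x||_1 subject to Ax = y), in the spirit of the Daubechies-DeVore-Fornasier-Guentuerk scheme. The function name, the geometric decay of the smoothing parameter eps, and the iteration count are illustrative assumptions; this is not the specific variant analyzed in the paper, whose data-driven eps update is what yields the global linear rate under the null space property.

    import numpy as np

    def irls_basis_pursuit(A, y, num_iters=100, eps=1.0):
        """Illustrative IRLS sketch for min ||x||_1 s.t. Ax = y.

        Assumptions (not from the paper): geometric eps decay and a
        fixed iteration budget; the paper's variant updates eps
        adaptively to obtain its global linear convergence rate.
        """
        # Initialize with the minimum-l2-norm solution of Ax = y.
        x = np.linalg.lstsq(A, y, rcond=None)[0]
        for _ in range(num_iters):
            # Smoothed surrogate for |x_i|; defines weights w_i = 1/d_i.
            d = np.sqrt(x**2 + eps**2)
            # Weighted least-squares step in closed form:
            # x = D A^T (A D A^T)^{-1} y, with D = diag(d).
            z = np.linalg.solve((A * d) @ A.T, y)
            x = d * (A.T @ z)
            # Shrink the smoothing parameter (illustrative schedule).
            eps = max(eps / 2.0, 1e-12)
        return x

    # Usage on a synthetic sparse-recovery instance:
    rng = np.random.default_rng(0)
    m, n, k = 50, 200, 8
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
    y = A @ x_true
    x_hat = irls_basis_pursuit(A, y)
    print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))

Each iteration only solves an m x m linear system, which is why IRLS remains attractive for large, non-smooth ℓ1 problems where generic solvers struggle.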
Pages: 14