SQUARE ROOT LASSO: WELL-POSEDNESS, LIPSCHITZ STABILITY, AND THE TUNING TRADE-OFF

Cited by: 0
Authors
Berk, Aaron [1 ]
Brugiapaglia, Simone [2 ]
Hoheisel, Tim [1 ]
Affiliations
[1] McGill Univ, Dept Math & Stat, Montreal, PQ H3A 0G4, Canada
[2] Concordia Univ, Dept Math & Stat, Montreal, PQ H3G 1M8, Canada
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords
square root LASSO; sparse recovery; variational analysis; convex analysis; sensitivity analysis; implicit function theorem; Lipschitz stability; SOLUTION UNIQUENESS;
DOI
10.1137/23M1561968
Chinese Library Classification
O29 [Applied Mathematics];
Discipline code
070104;
Abstract
This paper studies well-posedness and parameter sensitivity of the square root LASSO (SR-LASSO), an optimization model for recovering sparse solutions to linear inverse problems in finite dimension. An advantage of the SR-LASSO (e.g., over the standard LASSO) is that the optimal tuning of the regularization parameter is robust with respect to measurement noise. This paper provides three point-based regularity conditions at a solution of the SR-LASSO: the weak, intermediate, and strong assumptions. It is shown that the weak assumption implies uniqueness of the solution in question. The intermediate assumption yields a directionally differentiable and locally Lipschitz solution map (with explicit Lipschitz bounds), whereas the strong assumption gives continuous differentiability of said map around the point in question. Our analysis leads to new theoretical insights on the comparison between SR-LASSO and LASSO from the viewpoint of tuning parameter sensitivity: noise-robust optimal parameter choice for SR-LASSO comes at the "price" of elevated tuning parameter sensitivity. Numerical results support and showcase the theoretical findings.
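To make the model concrete: the SR-LASSO minimizes ||Ax - b||_2 + lam*||x||_1, i.e., the unsquared residual norm plus an l1 penalty. The sketch below (not the paper's algorithm; problem sizes, step size, and the noise-independent choice lam ~ sqrt(log(n)/m) are illustrative assumptions) solves it by proximal gradient descent with soft-thresholding, using the gradient A^T r / ||r||_2 of the data term, which is valid whenever the residual r = Ax - b is nonzero.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sr_lasso(A, b, lam, step=1e-2, iters=20000):
    """Proximal gradient sketch for min_x ||Ax - b||_2 + lam*||x||_1.

    The data term is differentiable away from r = Ax - b = 0, with
    gradient A^T r / ||r||_2; under measurement noise the residual
    generically stays bounded away from zero along the iterates.
    """
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        r = A @ x - b
        nr = np.linalg.norm(r)
        if nr < 1e-12:  # interpolating solution reached; gradient undefined
            break
        x = soft_threshold(x - step * (A.T @ r) / nr, step * lam)
    return x

# Hypothetical sparse recovery instance (all sizes/seeds illustrative).
rng = np.random.default_rng(0)
m, n, s = 50, 100, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
b = A @ x_true + 0.01 * rng.standard_normal(m)

# A noise-independent choice lam ~ sqrt(log(n)/m) illustrates the
# noise-robust parameter tuning highlighted in the abstract.
x_hat = sr_lasso(A, b, lam=np.sqrt(np.log(n) / m))
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

The same problem can of course be handed to a conic solver (e.g., via CVXPY backed by MOSEK, both of which appear in the paper's references); the first-order sketch above just keeps the example dependency-free.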
Pages: 2609-2637 (29 pages)