Convergence and Stability of Iteratively Re-weighted Least Squares Algorithms

Cited by: 72
Authors
Ba, Demba [1,2]
Babadi, Behtash [1,2]
Purdon, Patrick L. [1,2]
Brown, Emery N. [1,2,3]
Affiliations
[1] MIT, Department of Brain and Cognitive Sciences, Cambridge, MA 02139, USA
[2] Massachusetts General Hospital, Department of Anesthesia, Critical Care and Pain Medicine, Boston, MA 02114, USA
[3] MIT, Harvard-MIT Division of Health Sciences and Technology, Cambridge, MA 02139, USA
Funding
US National Institutes of Health (NIH)
Keywords
Compressive sampling; constrained maximum likelihood estimation; Gaussian scale mixtures; expectation-maximization algorithms; mathematical programming; stable recovery; sparse signals
DOI
10.1109/TSP.2013.2287685
CLC classification
TM [Electrical Engineering]; TN [Electronics and Communications Technology]
Discipline codes
0808 (Electrical Engineering); 0809 (Electronic Science and Technology)
Abstract
In this paper, we study the theoretical properties of iteratively re-weighted least squares (IRLS) algorithms and their utility in sparse signal recovery in the presence of noise. We demonstrate a one-to-one correspondence between the IRLS algorithms and a class of Expectation-Maximization (EM) algorithms for constrained maximum likelihood estimation under a Gaussian scale mixture (GSM) distribution. The EM formalism, together with the connection to GSMs, allows us to establish that the IRLS algorithms minimize smooth versions of the ℓ_ν "norms", for 0 < ν ≤ 1. We leverage EM theory to show that the limit points of the sequence of IRLS iterates are stationary points of the smooth "norm" minimization problem on the constraint set. We employ techniques from Compressive Sampling (CS) theory to show that the IRLS algorithm is stable if the limit point of the iterates coincides with the global minimizer. We further characterize the convergence rate of the IRLS algorithm, which implies global linear convergence for ν = 1 and local super-linear convergence for 0 < ν < 1. We demonstrate our results via simulation experiments. The simplicity of IRLS, along with the theoretical guarantees provided in this contribution, makes a compelling case for its adoption as a standard tool for sparse signal recovery.
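The re-weighting mechanics are easy to illustrate. Below is a minimal NumPy sketch of a generic IRLS loop for recovering a sparse x from noisy measurements y ≈ Ax: each iteration computes weights w_i = (x_i² + ε)^(ν/2 − 1) from the current estimate and then solves a weighted least-squares problem, so its fixed points minimize a smoothed ℓ_ν penalty. The regularized (Tikhonov-style) update, the function name irls_sparse_recovery, and the parameters lam and eps are illustrative assumptions; the paper's exact constrained-ML/EM formulation differs.

```python
import numpy as np

def irls_sparse_recovery(A, y, nu=1.0, lam=1e-2, eps=1e-6,
                         max_iter=100, tol=1e-8):
    """Generic IRLS sketch: minimize a smoothed l_nu penalty
    sum_i (x_i^2 + eps)^(nu/2) plus a least-squares data fit.
    Illustrative only -- not the authors' constrained-ML/EM variant."""
    x = A.T @ y  # simple initialization (assumption)
    for _ in range(max_iter):
        # Re-weighting step: small entries get large weights,
        # pushing them toward zero on the next solve.
        w = (x**2 + eps) ** (nu / 2.0 - 1.0)
        # Weighted least-squares step:
        # x <- argmin ||y - A x||^2 + lam * sum_i w_i * x_i^2
        x_new = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ y)
        if np.linalg.norm(x_new - x) <= tol * max(np.linalg.norm(x), 1.0):
            return x_new
        x = x_new
    return x

# Usage: recover a 10-sparse vector from 80 noisy measurements.
rng = np.random.default_rng(0)
n, m, k = 200, 80, 10
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x_true + 0.01 * rng.standard_normal(m)
x_hat = irls_sparse_recovery(A, y, nu=1.0)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

The smoothing parameter ε keeps the weights finite at zero entries, matching the "smooth versions of the ℓ_ν norms" viewpoint in the abstract; per the stated convergence results, ν = 1 gives global linear convergence while 0 < ν < 1 converges super-linearly near the minimizer.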
Pages: 183-195 (13 pages)