A FIRST-ORDER SMOOTHED PENALTY METHOD FOR COMPRESSED SENSING

Cited by: 23
Authors:
Aybat, N. S. [1 ]
Iyengar, G. [1 ]
Institutions:
[1] Columbia Univ, Dept Ind Engn & Operat Res, New York, NY 10027 USA
Funding:
National Science Foundation (NSF), USA
Keywords:
l_1-minimization; basis pursuit; compressed sensing; sparse optimization; smooth approximation of nonsmooth functions; Nesterov's method; ROBUST UNCERTAINTY PRINCIPLES; LINEAR INVERSE PROBLEMS; RECONSTRUCTION; L_1-MINIMIZATION; CONTINUATION; ALGORITHM
DOI
10.1137/090762294
Chinese Library Classification:
O29 [Applied Mathematics]
Subject classification code:
070104
Abstract:
We propose a first-order smoothed penalty algorithm (SPA) to solve the sparse recovery problem min{||x||_1 : Ax = b}. SPA is efficient as long as the matrix-vector products Ax and A^T y can be computed efficiently; in particular, A need not have orthogonal rows. SPA converges to the target signal by solving a sequence of penalized optimization subproblems, each of which is solved using Nesterov's optimal algorithm for simple sets [Yu. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, Kluwer Academic Publishers, Norwell, MA, 2004] and [Yu. Nesterov, Math. Program., 103 (2005), pp. 127-152]. We show that the SPA iterates x_k are ε-feasible, i.e., ||Ax_k - b||_2 ≤ ε, and ε-optimal, i.e., | ||x_k||_1 - ||x*||_1 | ≤ ε, after Õ(ε^{-3/2}) iterations. SPA can use an l_1, l_2, or l_∞ penalty on the infeasibility, and it extends easily to the relaxed recovery problem min{||x||_1 : ||Ax - b||_2 ≤ δ}.
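The penalty-plus-Nesterov scheme the abstract describes can be sketched in a few lines. The following is a hypothetical minimal implementation, not the authors' SPA: it Huber-smooths the l_1 norm, adds a quadratic (l_2) penalty on the infeasibility ||Ax - b||_2, and solves each penalized subproblem with Nesterov's accelerated gradient method while increasing the penalty weight. The function name `spa_sketch`, the parameter values, and the geometric continuation schedule are all illustrative assumptions.

```python
import numpy as np

def spa_sketch(A, b, mu=1e-3, lam0=1.0, rho=10.0, stages=5, iters=500):
    """Illustrative smoothed-penalty scheme (NOT the authors' SPA) for
    min ||x||_1  s.t.  Ax = b:
    minimize  huber_mu(x) + (lam/2) * ||Ax - b||_2^2
    with Nesterov's accelerated gradient, increasing lam between stages."""
    m, n = A.shape
    x = np.zeros(n)
    L_A = np.linalg.norm(A, 2) ** 2      # ||A||_2^2 (power iteration in practice)
    lam = lam0
    for _ in range(stages):
        L = 1.0 / mu + lam * L_A         # Lipschitz constant of the gradient
        y, t = x.copy(), 1.0             # warm-start the next subproblem
        for _ in range(iters):
            # gradient of the Huber-smoothed l1 norm plus the quadratic penalty
            g = np.clip(y / mu, -1.0, 1.0) + lam * (A.T @ (A @ y - b))
            x_new = y - g / L            # gradient step
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # Nesterov momentum
            x, t = x_new, t_new
        lam *= rho                       # tighten the penalty (continuation)
    return x
```

Note that, as the abstract emphasizes, only the products `A @ y` and `A.T @ r` are needed, so `A` can be replaced by any fast linear operator; the spectral-norm estimate is the one piece that would then need a power iteration.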
Pages: 287-313 (27 pages)
Cited references (20 in total):
[1] Aybat, N. S., Iyengar, G., 2009, SIAM J. Imaging Sci. (unpublished).
[2] Aybat, N. S., Iyengar, G., 2009, SIAM J. Optim. (unpublished).
[3] Becker, S., Bobin, J., Candès, E. J. NESTA: A Fast and Accurate First-Order Method for Sparse Recovery. SIAM Journal on Imaging Sciences, 2011, 4(1):1-39.
[4] Candès, E. J., Romberg, J., Tao, T. Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information. IEEE Transactions on Information Theory, 2006, 52(2):489-509.
[5] Candès, E. J., Romberg, J. Quantitative robust uncertainty principles and optimally sparse decompositions. Foundations of Computational Mathematics, 2006, 6(2):227-254.
[6] Candès, E. J., Tao, T. Near-optimal signal recovery from random projections: Universal encoding strategies? IEEE Transactions on Information Theory, 2006, 52(12):5406-5425.
[7] Daubechies, I., Defrise, M., De Mol, C. An iterative thresholding algorithm for linear inverse problems with a sparsity constraint. Communications on Pure and Applied Mathematics, 2004, 57(11):1413-1457.
[8] Daubechies, I., Fornasier, M., Loris, I. Accelerated Projected Gradient Method for Linear Inverse Problems with Sparsity Constraints. Journal of Fourier Analysis and Applications, 2008, 14(5-6):764-792.
[9] Donoho, D. L. Compressed sensing. IEEE Transactions on Information Theory, 2006, 52(4):1289-1306.
[10] Figueiredo, M. A. T., Nowak, R. D., Wright, S. J. Gradient Projection for Sparse Reconstruction: Application to Compressed Sensing and Other Inverse Problems. IEEE Journal of Selected Topics in Signal Processing, 2007, 1(4):586-597.