Hierarchical Bayesian models and sparsity: ℓ2-magic

Times cited: 42
Authors
Calvetti, D. [1 ]
Somersalo, E. [1 ]
Strang, A. [1 ]
Affiliations
[1] Case Western Reserve Univ, Dept Math Appl Math & Stat, Cleveland, OH 44106 USA
Keywords
convergence rate; sensitivity weighting; Bayesian hypermodel; compressive sensing; source localization; 3-D inversion; recovery
DOI
10.1088/1361-6420/aaf5ab
Chinese Library Classification
O29 [Applied Mathematics]
Discipline code
070104
Abstract
Sparse recovery seeks to estimate the support and the non-zero entries of a sparse signal x ∈ ℝⁿ from possibly incomplete noisy observations y = Ax₀ + ε, with A ∈ ℝ^{m×n}, m ≤ n. It has been shown that, under various restrictive conditions on the matrix A, the problem can be reduced to the ℓ1-regularized problem min ‖x‖₁ subject to ‖Ax − y‖₂ < δ, where δ is the size of the error ε, and the approximation error is well controlled by δ. A popular method for solving this minimization problem is the iteratively reweighted least squares (IRLS) algorithm. Here we reformulate sparse recovery as an inverse problem in the Bayesian framework, express the sparsity belief by means of a hierarchical prior model, and show that the maximum a posteriori (MAP) solution computed by a recently proposed iterative alternating sequential (IAS) algorithm, which requires only the solution of linear systems in the least squares sense, converges linearly to the unique minimizer for any matrix A, and quadratically on the complement of the support of the minimizer. The parameters of the hierarchical model are assigned from an estimate of the signal-to-noise ratio and an a priori belief about the degree of sparsity of the underlying signal, and automatically account for the sensitivity of the data to the different components of x. The approach gives a solid Bayesian interpretation of the sensitivity weighting commonly used in geophysics and biomedical applications. Moreover, since for a suitable choice of sequences of hyperprior parameters the IAS solution converges to the ℓ1-regularized solution, the Bayesian framework for inverse problems makes the ℓ1-magic happen in the ℓ2 framework.
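The abstract's key computational claim is that IAS needs only least-squares linear solves. A minimal sketch of such an alternating scheme, assuming the conditionally Gaussian hypermodel x_j | θ_j ~ N(0, θ_j) with a gamma hyperprior of shape β and scale θ*; the function name, parameter values, and iteration count below are illustrative assumptions, not the paper's prescriptions:

```python
import numpy as np

def ias(A, y, sigma=0.01, beta=1.6, theta_star=1e-4, n_iter=50):
    """Sketch of an iterative alternating sequential (IAS) scheme for the
    hypermodel x_j | theta_j ~ N(0, theta_j), theta_j ~ Gamma(beta, theta_star).
    All parameter values here are illustrative assumptions."""
    m, n = A.shape
    theta = np.full(n, theta_star)   # componentwise prior variances, updated each sweep
    eta = beta - 1.5                 # shape offset arising from the gamma hyperprior
    x = np.zeros(n)
    for _ in range(n_iter):
        # x-update: Tikhonov-type least squares with prior covariance diag(theta),
        # solved by stacking the whitened data and prior terms into one lstsq call
        M = np.vstack([A / sigma, np.diag(1.0 / np.sqrt(theta))])
        b = np.concatenate([y / sigma, np.zeros(n)])
        x = np.linalg.lstsq(M, b, rcond=None)[0]
        # theta-update: componentwise closed-form minimizer of the MAP objective;
        # variances grow on the support of x and shrink toward zero elsewhere
        theta = theta_star * (eta / 2 + np.sqrt(eta**2 / 4 + x**2 / (2 * theta_star)))
    return x
```

Note that only a linear least-squares problem appears in the x-update; sparsity is promoted entirely by the alternating variance updates, which is the sense in which the ℓ1-magic happens in an ℓ2 framework.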
Pages: 26
References (20 in total)
[1] [Anonymous], SOC IND APPL MATH.
[2] Calvetti D., Pascarella A., Pitolli F., Somersalo E., Vantaggi B. A hierarchical Krylov-Bayes iterative inverse solver for MEG with physiological preconditioning. Inverse Problems, 2015, 31(12).
[3] Calvetti D., Somersalo E. An Introduction to Bayesian Scientific Computing: Ten Lectures on Subjective Computing. Springer, 2007.
[4] Calvetti D., Somersalo E. A Gaussian hypermodel to recover blocky objects. Inverse Problems, 2007, 23(2): 733-754.
[5] Calvetti D., Pascarella A., Pitolli F., Somersalo E., Vantaggi B. Brain activity mapping from MEG data via a hierarchical Bayesian algorithm with automatic depth weighting. Brain Topography, 2019, 32(3): 363-393.
[6] Calvetti D., Hakula H., Pursiainen S., Somersalo E. Conditionally Gaussian hypermodels for cerebral source localization. SIAM Journal on Imaging Sciences, 2009, 2(3): 879-909.
[7] Candès E.J., Tao T. Decoding by linear programming. IEEE Transactions on Information Theory, 2005, 51(12): 4203-4215.
[8] Candès E.J., Romberg J.K., Tao T. Stable signal recovery from incomplete and inaccurate measurements. Communications on Pure and Applied Mathematics, 2006, 59(8): 1207-1223.
[9] Daubechies I., DeVore R., Fornasier M., Güntürk C.S. Iteratively reweighted least squares minimization for sparse recovery. Communications on Pure and Applied Mathematics, 2010, 63(1): 1-38.
[10] Donoho D.L. For most large underdetermined systems of equations, the minimal ℓ1-norm near-solution approximates the sparsest near-solution. Communications on Pure and Applied Mathematics, 2006, 59(7): 907-934.