Fast Markov chain Monte Carlo sampling for sparse Bayesian inference in high-dimensional inverse problems using L1-type priors

Cited by: 19
Authors
Lucka, Felix [1,2]
Affiliations
[1] Univ Munster, Inst Computat & Appl Math, D-48149 Munster, Germany
[2] Univ Munster, Inst Biomagnetism & Biosignalanal, D-48149 Munster, Germany
Keywords
X-RAY TOMOGRAPHY; STATISTICAL INVERSION; DISTRIBUTIONS; RADIOGRAPHS;
DOI
10.1088/0266-5611/28/12/125012
CLC classification
O29 [Applied Mathematics]
Subject classification
070104
Abstract
Sparsity has become a key concept for solving high-dimensional inverse problems by variational regularization techniques. Recently, encoding similar sparsity constraints in the prior distribution within the Bayesian framework for inverse problems has attracted attention. Important questions about the relation between regularization theory and Bayesian inference still need to be addressed for sparsity-promoting inversion. A practical obstacle to such examinations is the lack of fast posterior sampling algorithms for sparse, high-dimensional Bayesian inversion: accessing the full range of Bayesian inference methods requires drawing samples from the posterior probability distribution in a fast and efficient way, which is usually done with Markov chain Monte Carlo (MCMC) sampling algorithms. In this paper, we develop and examine a new implementation of a single-component Gibbs MCMC sampler for sparse priors relying on L1 norms. We demonstrate that the efficiency of our Gibbs sampler increases when the level of sparsity or the dimension of the unknowns is increased. This behavior is contrary to that of the most commonly applied Metropolis-Hastings (MH) sampling schemes, whose efficiency for L1-type priors, as we demonstrate, decreases dramatically when the level of sparsity or the dimension of the unknowns is increased. In practice, Bayesian inversion for L1-type priors using MH samplers is not feasible at all. As this is commonly believed to be an intrinsic feature of MCMC sampling, the performance of our Gibbs sampler also challenges common beliefs about the applicability of sample-based Bayesian inference.
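The single-component Gibbs strategy described above can be sketched in a few lines. The following is a minimal illustration, not the paper's actual implementation: it assumes a linear forward operator `A`, Gaussian noise of level `sigma`, and a Laplace (L1) prior with rate `lam` (all names hypothetical), so the posterior is p(x|y) ∝ exp(-||Ax - y||² / (2σ²) - λ||x||₁). The key property exploited is that each one-dimensional full conditional is a two-component mixture of truncated Gaussians (one branch per sign of x_i) and can therefore be sampled exactly, with no accept/reject step:

```python
import numpy as np
from scipy.special import log_ndtr
from scipy.stats import truncnorm


def gibbs_l1(A, y, sigma, lam, n_samples, rng=None):
    """Single-component Gibbs sampler for
    p(x|y) ∝ exp(-||Ax - y||^2 / (2 sigma^2) - lam * ||x||_1).
    Each 1D conditional is sampled exactly as a mixture of two
    truncated Gaussians (illustrative sketch, not the paper's code)."""
    rng = np.random.default_rng() if rng is None else rng
    n = A.shape[1]
    x = np.zeros(n)
    col_sq = np.sum(A**2, axis=0)      # c_i = ||A_i||^2 per column
    r = y - A @ x                      # running residual
    samples = np.empty((n_samples, n))
    for t in range(n_samples):
        for i in range(n):
            c = col_sq[i]
            if c == 0.0:               # zero column: skipped for brevity
                continue
            r += A[:, i] * x[i]        # residual with x_i removed
            d = A[:, i] @ r
            std = sigma / np.sqrt(c)
            mu_p = (d - lam * sigma**2) / c   # Gaussian mean on x_i > 0
            mu_m = (d + lam * sigma**2) / c   # Gaussian mean on x_i < 0
            # log-masses of the two branches (stable via log_ndtr)
            lw_p = 0.5 * (mu_p / std) ** 2 + log_ndtr(mu_p / std)
            lw_m = 0.5 * (mu_m / std) ** 2 + log_ndtr(-mu_m / std)
            p_pos = np.exp(lw_p - np.logaddexp(lw_p, lw_m))
            if rng.random() < p_pos:   # positive branch, support (0, inf)
                a, b, mu = -mu_p / std, np.inf, mu_p
            else:                      # negative branch, support (-inf, 0)
                a, b, mu = -np.inf, -mu_m / std, mu_m
            x[i] = truncnorm.rvs(a, b, loc=mu, scale=std, random_state=rng)
            r -= A[:, i] * x[i]        # restore residual with new x_i
        samples[t] = x
    return samples
```

Because the conditional is sampled exactly, every Gibbs update is accepted by construction, which is one way to see why the scheme does not suffer from the vanishing acceptance rates that cripple MH samplers for sharply peaked L1-type posteriors.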
Pages: 31