Computationally Efficient Sampling Methods for Sparsity Promoting Hierarchical Bayesian Models

Cited by: 3
Authors
Calvetti, D. [1 ]
Somersalo, E. [1 ]
Affiliations
[1] Case Western Reserve Univ, Dept Math Appl Math & Stat, Cleveland, OH 44106 USA
Keywords
preconditioned Crank-Nicolson; Markov chain Monte Carlo; noncentered sampling; MAXIMUM-LIKELIHOOD-ESTIMATION; DIMENSIONAL INVERSE PROBLEMS; REGULARIZATION PARAMETERS; 3-D INVERSION; INFERENCE;
DOI
10.1137/23M1564043
Chinese Library Classification
O1 [Mathematics];
Discipline Code
0701; 070101;
Abstract
Bayesian hierarchical models have been demonstrated to provide efficient algorithms for finding sparse solutions to ill-posed inverse problems. The models typically comprise a conditionally Gaussian prior model for the unknown, augmented by a hyperprior model for the variances. A widely used choice for the hyperprior is a member of the family of generalized gamma distributions. Most of the work in the literature has concentrated on numerical approximation of the maximum a posteriori estimates, and less attention has been paid to sampling methods or other means of uncertainty quantification. Sampling from the hierarchical models is challenging mainly for two reasons: the models are typically high dimensional, thus suffering from the curse of dimensionality, and the strong correlation between the unknown of interest and its variance can make sampling rather inefficient. This work addresses mainly the first of these obstacles. By using a novel reparametrization, it is shown how the posterior distribution can be transformed into one dominated by Gaussian white noise, allowing sampling with the preconditioned Crank-Nicolson (pCN) scheme, which has been shown to be efficient for sampling from distributions dominated by a Gaussian component. Furthermore, a novel idea for speeding up the pCN scheme in a special case is developed, and the question of how strongly the hierarchical models concentrate on sparse solutions is addressed in light of a computed example.
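The pCN scheme referred to in the abstract can be illustrated by a minimal sketch for a target of the form exp(-Φ(x))·N(x; 0, I), i.e., after the kind of whitening reparametrization the paper describes. This is a generic textbook pCN sampler, not the authors' implementation; the function and parameter names (`pcn_sample`, `phi`, `beta`) are ours.

```python
import numpy as np

def pcn_sample(phi, dim, n_samples, beta=0.5, rng=None):
    """Preconditioned Crank-Nicolson (pCN) sampler for a target density
    proportional to exp(-phi(x)) * N(x; 0, I).

    The proposal leaves the standard Gaussian reference measure invariant,
    so the acceptance ratio involves only the likelihood potential phi and
    is robust to the dimension of x.
    """
    rng = np.random.default_rng(rng)
    x = rng.standard_normal(dim)
    phi_x = phi(x)
    samples = np.empty((n_samples, dim))
    accepted = 0
    for i in range(n_samples):
        # pCN proposal: autoregressive step preserving N(0, I)
        xp = np.sqrt(1.0 - beta**2) * x + beta * rng.standard_normal(dim)
        phi_xp = phi(xp)
        # Accept with probability min(1, exp(phi(x) - phi(x')))
        if np.log(rng.uniform()) < phi_x - phi_xp:
            x, phi_x = xp, phi_xp
            accepted += 1
        samples[i] = x
    return samples, accepted / n_samples
```

For example, with the potential Φ(x) = ||x||²/2 the posterior is N(0, I/2), so the sample variance of a long chain should approach 1/2.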
Pages: 524-548
Page count: 25