Gradient-Based Markov Chain Monte Carlo for Bayesian Inference With Non-differentiable Priors

Cited by: 5
Authors
Goldman, Jacob Vorstrup [1]
Sell, Torben [2]
Singh, Sumeetpal Sidhu [1]
Affiliations
[1] Univ Cambridge, Dept Engn, Signal Proc & Commun Lab, Cambridge, England
[2] Univ Cambridge, Dept Pure Math & Math Stat, Cambridge, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
Bayesian imaging; Markov chain Monte Carlo; Markov processes; Proximal operators; Piecewise deterministic; GENE SELECTION; ALGORITHMS; LANGEVIN; CLASSIFICATION; REGRESSION; CANCER
DOI
10.1080/01621459.2021.1909600
Chinese Library Classification (CLC)
O21 [Probability theory and mathematical statistics]; C8 [Statistics]
Discipline codes
020208; 070103; 0714
Abstract
The use of nondifferentiable priors in Bayesian statistics has become increasingly popular, particularly in Bayesian imaging analysis. Current state-of-the-art methods are approximate in the sense that they replace the posterior with a smooth approximation via Moreau-Yosida envelopes and apply gradient-based discretized diffusions to sample from the resulting distribution. We characterize the error of the Moreau-Yosida approximation and propose a novel implementation using underdamped Langevin dynamics. In mission-critical cases, however, replacing the posterior with an approximation may not be a viable option. Instead, we show that piecewise-deterministic Markov processes (PDMPs) can be used for exact posterior inference from distributions satisfying almost everywhere differentiability. Furthermore, in contrast with diffusion-based methods, the suggested PDMP-based samplers place no assumptions on the shape of the prior, nor do they require access to a computationally cheap proximal operator, and consequently have a much broader scope of application. Through detailed numerical examples, including a nondifferentiable circular distribution and a nonconvex genomics model, we elucidate the relative strengths of these sampling methods on problems of moderate to high dimension, underlining the benefits of PDMP-based methods when accurate sampling is decisive. Supplementary materials for this article are available online.
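The smoothing approach summarized in the abstract can be illustrated in a few lines. For a Laplace (L1) prior the Moreau-Yosida envelope has a closed-form proximal operator (soft-thresholding), and the envelope's gradient, (x - prox(x))/lambda, drives an unadjusted Langevin step on the smoothed posterior. The sketch below is purely illustrative and not the authors' implementation: it assumes a toy scalar Gaussian likelihood, and all function names and parameter values are invented for this example.

```python
import numpy as np

def prox_l1(x, t):
    """Proximal operator of t*|x|: soft-thresholding (closed form for the Laplace prior)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def grad_moreau_yosida(x, lam):
    """Gradient of the Moreau-Yosida envelope of |x| with parameter lam:
    grad f_lam(x) = (x - prox_{lam*f}(x)) / lam, which is (1/lam)-Lipschitz."""
    return (x - prox_l1(x, lam)) / lam

def myula_chain(y, sigma2, lam, gamma, n_steps, rng):
    """Unadjusted Langevin steps targeting the smoothed posterior
    p_lam(x) proportional to exp(-(y - x)^2 / (2*sigma2) - f_lam(x)),
    i.e. Gaussian likelihood N(y | x, sigma2) with a smoothed Laplace prior."""
    x = 0.0
    samples = np.empty(n_steps)
    for k in range(n_steps):
        # Gradient of the negative log smoothed posterior.
        grad_u = (x - y) / sigma2 + grad_moreau_yosida(x, lam)
        # Euler-Maruyama discretization of the overdamped Langevin diffusion.
        x = x - gamma * grad_u + np.sqrt(2.0 * gamma) * rng.standard_normal()
        samples[k] = x
    return samples

rng = np.random.default_rng(0)
samples = myula_chain(y=2.0, sigma2=1.0, lam=0.1, gamma=0.05, n_steps=20000, rng=rng)
```

Because the envelope's gradient is (1/lam)-Lipschitz, the smoothed target admits stable explicit step sizes even though the original prior is nondifferentiable at zero; the samples are biased by both the smoothing parameter lam and the discretization step gamma, which is exactly the approximation error the article analyzes and which motivates the exact PDMP alternative.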
Pages: 2182-2193
Number of pages: 12