Honest exploration of intractable probability distributions via Markov chain Monte Carlo

Cited by: 164
Authors
Jones, GL
Hobert, JP
Institutions
[1] Univ Minnesota, Sch Stat, Minneapolis, MN 55455 USA
[2] Univ Florida, Dept Stat, Gainesville, FL 32611 USA
Keywords
central limit theorem; convergence rate; coupling inequality; drift condition; general state space; geometric ergodicity; Gibbs sampler; hierarchical random effects model; Metropolis algorithm; minorization condition; regeneration; splitting; uniform ergodicity;
DOI
10.1214/ss/1015346317
CLC classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject classification codes
020208 ; 070103 ; 0714 ;
Abstract
Two important questions that must be answered whenever a Markov chain Monte Carlo (MCMC) algorithm is used are (Q1) What is an appropriate burn-in? and (Q2) How long should the sampling continue after burn-in? Developing rigorous answers to these questions presently requires a detailed study of the convergence properties of the underlying Markov chain. Consequently, in most practical applications of MCMC, exact answers to (Q1) and (Q2) are not sought. The goal of this paper is to demystify the analysis that leads to honest answers to (Q1) and (Q2). The authors hope that this article will serve as a bridge between those developing Markov chain theory and practitioners using MCMC to solve practical problems. The ability to address (Q1) and (Q2) formally comes from establishing a drift condition and an associated minorization condition, which together imply that the underlying Markov chain is geometrically ergodic. In this article, we explain exactly what drift and minorization are, as well as how and why these conditions can be used to form rigorous answers to (Q1) and (Q2). The basic ideas are as follows. The results of Rosenthal (1995) and Roberts and Tweedie (1999) allow one to use drift and minorization conditions to construct a formula giving an analytic upper bound on the distance to stationarity. A rigorous answer to (Q1) can be calculated using this formula. The desired characteristics of the target distribution are typically estimated using ergodic averages. Geometric ergodicity of the underlying Markov chain implies that central limit theorems are available for ergodic averages (Chan and Geyer, 1994). The regenerative simulation technique (Mykland, Tierney and Yu, 1995; Robert, 1995) can be used to obtain a consistent estimate of the variance of the asymptotic normal distribution. Hence, an asymptotic standard error can be calculated, which provides an answer to (Q2) in the sense that an appropriate time to stop sampling can be determined.
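To make the route from drift and minorization constants to a burn-in concrete, the following Python fragment is a minimal sketch, not the paper's analysis verbatim. It assumes a drift condition E[V(X_1) | X_0 = x] <= lam * V(x) + b and a minorization of mass eps on the set {x : V(x) <= d}, and plugs these constants into a total-variation bound of the Rosenthal (1995) form; the exact constants `alpha` and `U` below are reproduced from memory of that theorem and should be checked against the original statement, and the numerical inputs in the usage example are invented, not taken from any real drift analysis.

```python
def rosenthal_bound(n, lam, b, eps, d, v_x, r):
    """Analytic upper bound on the total-variation distance to
    stationarity after n steps, in the form of Rosenthal (1995):
      drift:        E[V(X_1) | X_0 = x] <= lam * V(x) + b,  0 < lam < 1
      minorization: mass eps on C = {x : V(x) <= d},  d > 2b / (1 - lam)
    v_x is V at the starting point; 0 < r < 1 is a free tuning constant."""
    alpha = (1.0 + d) / (1.0 + 2.0 * b + lam * d)   # > 1 when d > 2b/(1-lam)
    big_u = 1.0 + 2.0 * (lam * d + b)
    return ((1.0 - eps) ** (r * n)
            + (alpha ** (-(1.0 - r)) * big_u ** r) ** n
            * (1.0 + b / (1.0 - lam) + v_x))

def honest_burn_in(lam, b, eps, d, v_x, tol=0.01, r=0.02, n_max=10**6):
    """Smallest n whose bound is <= tol: a rigorous answer to (Q1)."""
    for n in range(1, n_max + 1):
        if rosenthal_bound(n, lam, b, eps, d, v_x, r) <= tol:
            return n
    return None  # the bound never certifies convergence for these constants

# Hypothetical constants (d > 2b/(1-lam) is required, and r must be small
# enough that the geometric term actually contracts):
n_star = honest_burn_in(lam=0.5, b=1.0, eps=0.1, d=5.0, v_x=1.0,
                        tol=0.01, r=0.02)
```

Note that the bound is typically very conservative and sensitive to the choice of r; in practice one would minimize over r rather than fix it.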
The methods are illustrated using a Gibbs sampler for a Bayesian version of the one-way random effects model and a data set concerning styrene exposure.
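For (Q2), the regenerative idea reduces to a ratio-estimator standard error computed over regeneration tours. The Python sketch below assumes the tour sums S_t and tour lengths N_t have already been produced by some problem-specific regeneration (splitting) scheme; the function names and the stopping rule are hypothetical simplifications of the procedure described above, not the authors' code.

```python
import math

def regenerative_estimate(tour_sums, tour_lens):
    """Given the sums S_t of g over each regeneration tour and the tour
    lengths N_t, return (ergodic average, asymptotic standard error).
    The SE is the classical ratio-estimator form used in regenerative
    simulation:  sqrt(sum_t (S_t - gbar * N_t)^2) / sum_t N_t."""
    total_n = sum(tour_lens)
    gbar = sum(tour_sums) / total_n
    ss = sum((s - gbar * n) ** 2 for s, n in zip(tour_sums, tour_lens))
    return gbar, math.sqrt(ss) / total_n

def run_until_precise(tours, target_se):
    """A simplified answer to (Q2): consume tours until the regenerative
    standard error drops below target_se.  `tours` yields (S_t, N_t)."""
    sums, lens = [], []
    for s, n in tours:
        sums.append(s)
        lens.append(n)
        if len(sums) >= 10:  # a few tours are needed before the SE means much
            gbar, se = regenerative_estimate(sums, lens)
            if se < target_se:
                return gbar, se, len(sums)
    return regenerative_estimate(sums, lens) + (len(sums),)
```

When every step is a regeneration (tours of length 1, as for i.i.d. draws), the standard error reduces to the familiar sd/sqrt(n), which is a useful sanity check on the formula.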
Pages: 312 - 334
Page count: 23
Related papers
50 items in total
  • [31] Multilevel Markov Chain Monte Carlo
    Dodwell, T. J.
    Ketelsen, C.
    Scheichl, R.
    Teckentrup, A. L.
    SIAM REVIEW, 2019, 61 (03) : 509 - 545
  • [32] THE MARKOV CHAIN MONTE CARLO REVOLUTION
    Diaconis, Persi
    BULLETIN OF THE AMERICAN MATHEMATICAL SOCIETY, 2009, 46 (02) : 179 - 205
  • [33] MARKOV CHAIN MONTE CARLO AND IRREVERSIBILITY
    Ottobre, Michela
    REPORTS ON MATHEMATICAL PHYSICS, 2016, 77 (03) : 267 - 292
  • [34] STEREOGRAPHIC MARKOV CHAIN MONTE CARLO
    Yang, Jun
    Latuszynski, Krzysztof
    Roberts, Gareth O.
    ANNALS OF STATISTICS, 2024, 52 (06) : 2692 - 2713
  • [35] Noisy Hamiltonian Monte Carlo for Doubly Intractable Distributions
    Stoehr, Julien
    Benson, Alan
    Friel, Nial
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2019, 28 (01) : 220 - 232
  • [36] Efficient Visual Tracking via Hamiltonian Monte Carlo Markov Chain
    Wang, Fasheng
    Lu, Mingyu
    COMPUTER JOURNAL, 2013, 56 (09) : 1102 - 1112
  • [37] Fast Markov chain Monte Carlo algorithms via Lie groups
    Huntsman, Steve
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108 : 2841 - 2850
  • [38] Soft Evidential Update via Markov Chain Monte Carlo Inference
    Jain, Dominik
    Beetz, Michael
    KI 2010: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2010, 6359 : 280 - 290
  • [39] Bayesian Trend Filtering via Proximal Markov Chain Monte Carlo
    Heng, Qiang
    Zhou, Hua
    Chi, Eric C.
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2023, 32 (03) : 938 - 949
  • [40] Image Registration via Stochastic Gradient Markov Chain Monte Carlo
    Grzech, Daniel
    Kainz, Bernhard
    Glocker, Ben
    Le Folgoc, Loic
    UNCERTAINTY FOR SAFE UTILIZATION OF MACHINE LEARNING IN MEDICAL IMAGING, AND GRAPHS IN BIOMEDICAL IMAGE ANALYSIS, UNSURE 2020, GRAIL 2020, 2020, 12443 : 3 - 12