The Hastings algorithm at fifty

Cited by: 22
Authors
Dunson, D. B. [1]
Johndrow, J. E. [2]
Affiliations
[1] Duke Univ, Dept Stat Sci, Box 90251, Durham, NC 27707 USA
[2] Univ Penn, Wharton Sch, Dept Stat, 3730 Walnut St, Philadelphia, PA 19104 USA
Funding
US National Science Foundation
Keywords
Bayesian computation; Importance sampling; Markov chain Monte Carlo; Metropolis-Hastings; Posterior sampling; Rejection sampling; CHAIN MONTE-CARLO; BAYESIAN-ANALYSIS; POLYNOMIAL ERGODICITY; MARKOV-CHAINS; CONVERGENCE; GIBBS; POSTERIOR; PARALLEL; SAMPLER; MODELS;
DOI
10.1093/biomet/asz066
Chinese Library Classification
Q [Biological Sciences]
Discipline Classification Codes
07; 0710; 09
Abstract
In a 1970 Biometrika paper, W. K. Hastings developed a broad class of Markov chain algorithms for sampling from probability distributions that are difficult to sample from directly. The algorithm draws a candidate value from a proposal distribution and accepts the candidate with a probability that can be computed using only the unnormalized density of the target distribution, allowing one to sample from distributions known only up to a constant of proportionality. The stationary distribution of the corresponding Markov chain is the target distribution one is attempting to sample from. The Hastings algorithm generalizes the Metropolis algorithm to allow a much broader class of proposal distributions instead of just symmetric cases. An important class of applications for the Hastings algorithm corresponds to sampling from Bayesian posterior distributions, which have densities given by a prior density multiplied by a likelihood function and divided by a normalizing constant equal to the marginal likelihood. The marginal likelihood is typically intractable, presenting a fundamental barrier to implementation in Bayesian statistics. This barrier can be overcome by Markov chain Monte Carlo sampling algorithms. Amazingly, even after 50 years, the majority of algorithms used in practice today involve the Hastings algorithm. This article provides a brief celebration of the continuing impact of this ingenious algorithm on the 50th anniversary of its publication.
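The acceptance step described in the abstract can be sketched in a few lines of code. Below is a minimal, illustrative Hastings sampler; it is not the paper's own implementation, and the target density, proposal, and all names (`hastings_sampler`, `log_target`, `log_q`) are assumptions chosen for the example. It shows the two points the abstract emphasizes: the target's normalizing constant cancels in the acceptance ratio, and an extra proposal-density term generalizes the Metropolis rule to asymmetric proposals.

```python
import math
import random

def hastings_sampler(log_target, propose, log_q, x0, n_iters=20000, seed=1):
    """Hastings algorithm: draw a candidate y from proposal q(. | x) and
    accept it with probability min(1, p(y) q(x|y) / (p(x) q(y|x))), where
    the target p need only be known up to a constant of proportionality."""
    rng = random.Random(seed)
    x, chain = x0, []
    for _ in range(n_iters):
        y = propose(rng, x)
        # Hastings ratio on the log scale: the normalizing constant of the
        # target cancels, and the log_q terms correct for asymmetry in q.
        log_alpha = (log_target(y) + log_q(x, y)) - (log_target(x) + log_q(y, x))
        if math.log(rng.random()) < log_alpha:
            x = y  # accept the candidate; otherwise keep the current state
        chain.append(x)
    return chain

# Illustration (assumed target, not from the paper): sample a standard normal
# known only through its unnormalized log-density -x^2/2, with a Gaussian
# random-walk proposal. The proposal is symmetric, so the q terms cancel and
# the rule reduces to the original Metropolis acceptance probability.
sd = 1.0
chain = hastings_sampler(
    log_target=lambda x: -0.5 * x * x,
    propose=lambda rng, x: x + rng.gauss(0.0, sd),
    log_q=lambda y, x: -0.5 * ((y - x) / sd) ** 2,  # q(y | x) up to a constant
    x0=0.0,
)
mean = sum(chain) / len(chain)
var = sum((v - mean) ** 2 for v in chain) / len(chain)
```

For a Bayesian posterior, `log_target` would be the log-prior plus the log-likelihood; the intractable marginal likelihood never needs to be computed, which is exactly the barrier the abstract describes the algorithm overcoming.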
Pages: 1-23