An introduction to Markov chain Monte Carlo methods

Cited: 0
Authors
Besag, J [1]
Affiliation
[1] Univ Washington, Dept Stat, Seattle, WA 98195 USA
Keywords
autologistic distribution; Bayesian computation; Gibbs sampler; Hastings algorithm; hidden Markov models; importance sampling; Ising model; Markov chain Monte Carlo; Markov random fields; maximum likelihood estimation; Metropolis method; noisy binary channel; perfect simulation; reversibility; simulated annealing;
DOI
Not available
Chinese Library Classification (CLC)
O42 [Acoustics]
Subject Classification Code
070206; 082403
Abstract
This article provides an introduction to Markov chain Monte Carlo methods in statistical inference. Over the past twelve years or so, these have revolutionized what can be achieved computationally, especially in the Bayesian paradigm. Markov chain Monte Carlo has exactly the same goals as ordinary Monte Carlo and both are intended to exploit the fact that one can learn about a complex probability distribution if one can sample from it. Although the ordinary version can only rarely be implemented, it is convenient initially to presume otherwise and to focus on the rationale of the sampling approach, rather than computational details. The article then moves on to describe implementation via Markov chains, especially the Hastings algorithm, including the Metropolis method and the Gibbs sampler as special cases. Hidden Markov models and the autologistic distribution receive some emphasis, with the noisy binary channel used in some toy examples. A brief description of perfect simulation is also given. The account concludes with some discussion.
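As a purely illustrative aside (not taken from the article itself), the following is a minimal sketch of the random-walk Metropolis sampler that the abstract cites as a special case of the Hastings algorithm. The target density, proposal scale, and chain length here are all assumptions chosen for the example: the target is an unnormalized standard normal, and the proposal is a symmetric Gaussian step, so the proposal densities cancel in the acceptance ratio.

```python
import math
import random


def log_density(x):
    # Assumed target: log of an unnormalized standard normal density.
    return -0.5 * x * x


def metropolis(n_samples, x0=0.0, step=1.0):
    """Random-walk Metropolis sampler (a special case of the Hastings
    algorithm with a symmetric proposal distribution)."""
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step)       # symmetric proposal
        log_ratio = log_density(proposal) - log_density(x)
        if math.log(random.random()) < log_ratio:    # accept with prob min(1, ratio)
            x = proposal
        samples.append(x)
    return samples


if __name__ == "__main__":
    chain = metropolis(10_000)
    # For the assumed standard normal target, the sample mean should be near 0.
    print(sum(chain) / len(chain))
```

The Gibbs sampler discussed in the article arises from the same Hastings framework when each proposal updates one component from its full conditional distribution and is therefore always accepted.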
Pages: 247-270
Page count: 24