This article provides an introduction to Markov chain Monte Carlo methods in statistical inference. Over the past twelve years or so, these have revolutionized what can be achieved computationally, especially in the Bayesian paradigm. Markov chain Monte Carlo has exactly the same goals as ordinary Monte Carlo and both are intended to exploit the fact that one can learn about a complex probability distribution if one can sample from it. Although the ordinary version can only rarely be implemented, it is convenient initially to presume otherwise and to focus on the rationale of the sampling approach, rather than computational details. The article then moves on to describe implementation via Markov chains, especially the Hastings algorithm, including the Metropolis method and the Gibbs sampler as special cases. Hidden Markov models and the autologistic distribution receive some emphasis, with the noisy binary channel used in some toy examples. A brief description of perfect simulation is also given. The account concludes with some discussion.
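The abstract names the Metropolis method as a special case of the Hastings algorithm. As an illustration only (not code from the article), the following is a minimal random-walk Metropolis sketch in Python; the target, proposal scale, and function names are all assumptions chosen for the example.

```python
import math
import random

def metropolis(log_target, x0, proposal_sd, n_samples, seed=0):
    """Minimal random-walk Metropolis sketch (illustrative, not from the article).

    log_target: log of the (possibly unnormalized) target density
    x0: starting state
    proposal_sd: standard deviation of the symmetric Gaussian proposal
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        # Propose a symmetric random-walk move y ~ N(x, proposal_sd^2).
        y = x + rng.gauss(0.0, proposal_sd)
        # Accept with probability min(1, pi(y)/pi(x)), computed on the log scale.
        if math.log(rng.random()) < log_target(y) - log_target(x):
            x = y
        samples.append(x)
    return samples

# Example target: standard normal density, up to a normalizing constant.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0,
                   proposal_sd=1.0, n_samples=20000)
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

Because the proposal is symmetric, the Hastings acceptance ratio reduces to the simple ratio of target densities, which is the defining feature of the original Metropolis method.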