Importance sampling type estimators based on approximate marginal Markov chain Monte Carlo

Cited by: 13
Authors
Vihola, Matti [1 ]
Helske, Jouni [1 ,2 ]
Franks, Jordan [1 ,3 ]
Affiliations
[1] Univ Jyvaskyla, Dept Math & Stat, POB 35, FI-40014 Jyvaskyla, Finland
[2] Linkoping Univ, Dept Sci & Technol, Linkoping, Sweden
[3] Newcastle Univ, Sch Math Stat & Phys, Newcastle Upon Tyne, Tyne & Wear, England
Funding
Academy of Finland;
Keywords
Delayed acceptance; importance sampling; Markov chain Monte Carlo; sequential Monte Carlo; pseudo-marginal method; unbiased estimator; CENTRAL LIMIT-THEOREMS; GEOMETRIC ERGODICITY; ADDITIVE-FUNCTIONALS; BAYESIAN COMPUTATION; UNIFORM ERGODICITY; STATE; GIBBS; SIMULATION; CONVERGENCE; HASTINGS;
DOI
10.1111/sjos.12492
Chinese Library Classification (CLC) codes
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject classification codes
020208 ; 070103 ; 0714 ;
Abstract
We consider importance sampling (IS) type weighted estimators based on Markov chain Monte Carlo (MCMC) targeting an approximate marginal of the target distribution. In the context of Bayesian latent variable models, the MCMC typically operates on the hyperparameters, and the subsequent weighting may be based on IS or sequential Monte Carlo (SMC), but allows for multilevel techniques as well. The IS approach provides a natural alternative to delayed acceptance (DA) pseudo-marginal/particle MCMC, and has many advantages over DA, including a straightforward parallelization and additional flexibility in MCMC implementation. We detail minimal conditions which ensure strong consistency of the suggested estimators, and provide central limit theorems with expressions for asymptotic variances. We demonstrate how our method can make use of SMC in the state space models context, using Laplace approximations and time-discretized diffusions. Our experimental results are promising and show that the IS-type approach can provide substantial gains relative to an analogous DA scheme, and is often competitive even without parallelization.
Pages: 1339-1376
Number of pages: 38
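As an illustration of the IS-type weighting described in the abstract, the sketch below (not the authors' implementation; the density functions, test function, and tuning values are hypothetical placeholders) runs a random-walk Metropolis chain targeting an approximate marginal and then reweights its output by the ratio of the exact to the approximate density, giving a self-normalized estimator. In the paper's latent-variable setting these weights would themselves typically be unbiased estimates produced by, e.g., SMC, rather than exact ratios.

```python
import numpy as np

# Hypothetical approximate marginal log-density (the MCMC target).
def log_approx(theta):
    return -0.5 * (theta - 0.3) ** 2

# Hypothetical exact marginal log-density (the weighting target).
def log_target(theta):
    return -0.5 * theta ** 2 - 0.1 * theta ** 4

# Test function whose expectation under the exact target is estimated.
def f(theta):
    return theta ** 2

def is_mcmc_estimate(n_iter=50_000, step=1.0, seed=1):
    rng = np.random.default_rng(seed)
    theta = 0.0
    la = log_approx(theta)
    num = den = 0.0
    for _ in range(n_iter):
        # Random-walk Metropolis step with respect to the approximation.
        prop = theta + step * rng.normal()
        la_prop = log_approx(prop)
        if np.log(rng.uniform()) < la_prop - la:
            theta, la = prop, la_prop
        # IS-type correction weight: exact density over approximate density.
        w = np.exp(log_target(theta) - la)
        num += w * f(theta)
        den += w
    # Self-normalized importance-sampling estimate of E_pi[f].
    return num / den

print(is_mcmc_estimate())
```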