Importance sampling type estimators based on approximate marginal Markov chain Monte Carlo

Cited: 13
Authors
Vihola, Matti [1 ]
Helske, Jouni [1 ,2 ]
Franks, Jordan [1 ,3 ]
Affiliations
[1] Univ Jyvaskyla, Dept Math & Stat, POB 35, FI-40014 Jyvaskyla, Finland
[2] Linkoping Univ, Dept Sci & Technol, Linkoping, Sweden
[3] Newcastle Univ, Sch Math Stat & Phys, Newcastle Upon Tyne, Tyne & Wear, England
Funding
Academy of Finland
Keywords
Delayed acceptance; importance sampling; Markov chain Monte Carlo; sequential Monte Carlo; pseudo-marginal method; unbiased estimator; CENTRAL LIMIT-THEOREMS; GEOMETRIC ERGODICITY; ADDITIVE-FUNCTIONALS; BAYESIAN COMPUTATION; UNIFORM ERGODICITY; STATE; GIBBS; SIMULATION; CONVERGENCE; HASTINGS;
DOI
10.1111/sjos.12492
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
We consider importance sampling (IS) type weighted estimators based on Markov chain Monte Carlo (MCMC) targeting an approximate marginal of the target distribution. In the context of Bayesian latent variable models, the MCMC typically operates on the hyperparameters, and the subsequent weighting may be based on IS or sequential Monte Carlo (SMC), but also allows for multilevel techniques. The IS approach provides a natural alternative to delayed acceptance (DA) pseudo-marginal/particle MCMC and has many advantages over DA, including straightforward parallelization and additional flexibility in the MCMC implementation. We detail minimal conditions that ensure strong consistency of the suggested estimators and provide central limit theorems with expressions for the asymptotic variances. We demonstrate how our method can make use of SMC in the context of state-space models, using Laplace approximations and time-discretized diffusions. Our experimental results are promising and show that the IS-type approach can provide substantial gains relative to an analogous DA scheme, and is often competitive even without parallelization.
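The central idea summarized above, run MCMC on a cheap approximate (marginal) target and then correct the resulting sample by importance weighting, can be illustrated with a minimal sketch. The sketch below is not the authors' SMC/particle implementation: the exact target, its approximation, the random-walk proposal, and the test function are all hypothetical placeholders chosen only to make the weighting step concrete.

```python
import numpy as np

# Hypothetical densities: an exact (unnormalized) log-target and a cheap
# Gaussian-like approximation that the MCMC actually targets.
def log_target(theta):
    return -0.5 * theta**2 - 0.1 * theta**4

def log_approx(theta):
    return -0.5 * theta**2

def mh_chain(logpi, n_iter, step=1.0, rng=None):
    """Random-walk Metropolis chain targeting exp(logpi)."""
    rng = np.random.default_rng() if rng is None else rng
    theta, chain = 0.0, np.empty(n_iter)
    for k in range(n_iter):
        prop = theta + step * rng.standard_normal()
        if np.log(rng.uniform()) < logpi(prop) - logpi(theta):
            theta = prop
        chain[k] = theta
    return chain

rng = np.random.default_rng(1)
chain = mh_chain(log_approx, n_iter=20_000, rng=rng)  # MCMC on the approximation

# IS-type correction: weight each state by the exact/approximate density ratio,
# then form a self-normalized weighted average of the test function.
log_w = log_target(chain) - log_approx(chain)
w = np.exp(log_w - log_w.max())          # stabilized, unnormalized weights
f = chain**2                             # test function f(theta) = theta^2
estimate = np.sum(w * f) / np.sum(w)
print(f"IS-corrected estimate of E[theta^2]: {estimate:.3f}")
```

The self-normalized weighted average at the end corresponds to the kind of IS-type correction described in the abstract: the approximate chain supplies the samples, and the weights account for the discrepancy between the approximate and exact targets, with the weighting step trivially parallelizable across states.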
Pages: 1339-1376
Number of pages: 38