CONDITIONAL SEQUENTIAL MONTE CARLO IN HIGH DIMENSIONS

Cited by: 0
Authors
Finke, Axel [1 ]
Thiery, Alexandre H. [1 ,2 ]
Affiliations
[1] Loughborough Univ, Dept Math Sci, Loughborough, England
[2] Natl Univ Singapore, Dept Stat & Appl Probabil, Singapore, Singapore
Keywords
High dimensions; curse of dimension; Markov chain Monte Carlo; particle filter; state-space model; RANDOM-WALK METROPOLIS; UNIFORM ERGODICITY; SAMPLING METHODS; PARTICLE; ALGORITHMS;
DOI
10.1214/22-AOS2252
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
The iterated conditional sequential Monte Carlo (i-CSMC) algorithm from Andrieu, Doucet and Holenstein (J. R. Stat. Soc. Ser. B Stat. Methodol. 72 (2010) 269-342) is an MCMC approach for efficiently sampling from the joint posterior distribution of the T latent states in challenging time-series models, for example, in nonlinear or non-Gaussian state-space models. It is also the main ingredient in particle Gibbs samplers which infer unknown model parameters alongside the latent states. In this work, we first prove that the i-CSMC algorithm suffers from a curse of dimension in the dimension of the states, D: it breaks down unless the number of samples ('particles'), N, proposed by the algorithm grows exponentially with D. Then we present a novel 'local' version of the algorithm which proposes particles using Gaussian random-walk moves that are suitably scaled with D. We prove that this iterated random-walk conditional sequential Monte Carlo (i-RW-CSMC) algorithm avoids the curse of dimension: for arbitrary N, its acceptance rates and expected squared jumping distance converge to nontrivial limits as D → ∞. If T = N = 1, our proposed algorithm reduces to a Metropolis-Hastings or Barker's algorithm with Gaussian random-walk moves and we recover the well-known scaling limits for such algorithms.
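The dimension-robust behavior the abstract describes for the T = N = 1 special case can be illustrated with a standard random-walk Metropolis experiment. The sketch below (an assumption-laden illustration of the classic D^{-1/2} proposal scaling for a standard Gaussian target, not the paper's i-RW-CSMC algorithm) shows an acceptance rate that settles to a nontrivial limit as D grows rather than collapsing to zero:

```python
import numpy as np

def rwm_acceptance(D, n_iters=20000, ell=2.38, seed=0):
    """Estimate the acceptance rate of random-walk Metropolis targeting a
    standard Gaussian in D dimensions, using the classic step size
    ell / sqrt(D); with ell = 2.38 the asymptotic rate is roughly 0.234."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(D)        # start the chain in stationarity
    step = ell / np.sqrt(D)           # proposal scale shrinks like D^{-1/2}
    accepts = 0
    for _ in range(n_iters):
        y = x + step * rng.standard_normal(D)
        # log acceptance ratio for the N(0, I) target density
        log_alpha = 0.5 * (x @ x - y @ y)
        if np.log(rng.uniform()) < log_alpha:
            x = y
            accepts += 1
    return accepts / n_iters
```

Running `rwm_acceptance(D)` for increasing D gives acceptance rates that hover near a stable nontrivial value, the scaling-limit behavior that the paper proves carries over to its local CSMC proposals.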
Pages: 437-463
Page count: 27
References
51 items
  • [1] AGRAWAL S., 2021, ARXIV, DOI 10.48550/ARXIV.2104.02020
  • [2] Uniform ergodicity of the iterated conditional SMC and geometric ergodicity of particle Gibbs samplers
    Andrieu, Christophe
    Lee, Anthony
    Vihola, Matti
    [J]. BERNOULLI, 2018, 24 (02) : 842 - 872
  • [3] Particle Markov chain Monte Carlo methods
    Andrieu, Christophe
    Doucet, Arnaud
    Holenstein, Roman
    [J]. JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY, 2010, 72 : 269 - 342
  • [4] MONTE CARLO CALCULATIONS OF RADIAL DISTRIBUTION FUNCTIONS FOR A PROTON-ELECTRON PLASMA
    BARKER, AA
    [J]. AUSTRALIAN JOURNAL OF PHYSICS, 1965, 18 (02): : 119 - &
  • [5] On the empirical efficiency of local MCMC algorithms with pools of proposals
    Bedard, Mylene
    Mireuta, Matei
    [J]. CANADIAN JOURNAL OF STATISTICS-REVUE CANADIENNE DE STATISTIQUE, 2013, 41 (04): : 657 - 678
  • [6] Scaling analysis of multiple-try MCMC methods
    Bedard, Mylene
    Douc, Randal
    Moulines, Eric
    [J]. STOCHASTIC PROCESSES AND THEIR APPLICATIONS, 2012, 122 (03) : 758 - 786
  • [7] Simple conditions for convergence of sequential Monte Carlo genealogies with applications
    Brown, Suzie
    Jenkins, Paul A.
    Johansen, Adam M.
    Koskela, Jere
    [J]. ELECTRONIC JOURNAL OF PROBABILITY, 2021, 26 : 1 - 22
  • [8] CAPPE O., 2005, SPRINGER SERIES IN STATISTICS
  • [9] On particle Gibbs sampling
    Chopin, Nicolas
    Singh, Sumeetpal S.
    [J]. BERNOULLI, 2015, 21 (03) : 1855 - 1883
  • [10] Markov chain Monte Carlo using an approximation
    Christen, JA
    Fox, C
    [J]. JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2005, 14 (04) : 795 - 810