Divide-and-Conquer With Sequential Monte Carlo

Cited by: 23
Authors
Lindsten, F. [1 ]
Johansen, A. M. [2 ]
Naesseth, C. A. [3 ]
Kirkpatrick, B. [4 ]
Schon, T. B. [5 ]
Aston, J. A. D. [6 ]
Bouchard-Cote, A. [7 ]
Affiliations
[1] Uppsala Univ, Dept Informat Technol, Div Syst & Control, Uppsala, Sweden
[2] Univ Warwick, Dept Stat, Coventry, W Midlands, England
[3] Linkoping Univ, Dept Elect Engn, Linkoping, Sweden
[4] Intrepid Net Comp, Dillon, MT USA
[5] Uppsala Univ, Dept Informat Technol, Uppsala, Sweden
[6] Univ Cambridge, Cambridge, England
[7] Univ British Columbia, Stat, Vancouver, BC, Canada
Funding
Natural Sciences and Engineering Research Council of Canada; Engineering and Physical Sciences Research Council (UK); Swedish Research Council
Keywords
Bayesian methods; Graphical models; Hierarchical models; Particle filters; Inference
DOI
10.1080/10618600.2016.1237363
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
We propose a novel class of Sequential Monte Carlo (SMC) algorithms, appropriate for inference in probabilistic graphical models. This class of algorithms adopts a divide-and-conquer approach based upon an auxiliary tree-structured decomposition of the model of interest, turning the overall inferential task into a collection of recursively solved subproblems. The proposed method is applicable to a broad class of probabilistic graphical models, including models with loops. Unlike a standard SMC sampler, the proposed divide-and-conquer SMC employs multiple independent populations of weighted particles, which are resampled, merged, and propagated as the method progresses. We illustrate empirically that this approach can outperform standard methods in terms of the accuracy of the posterior expectation and marginal likelihood approximations. Divide-and-conquer SMC also opens up novel parallel implementation options and the possibility of concentrating the computational effort on the most challenging subproblems. We demonstrate its performance on a Markov random field and on a hierarchical logistic regression problem. Supplementary materials including proofs and additional numerical results are available online.
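To make the divide-and-conquer recursion described in the abstract concrete, the following is a minimal, illustrative Python sketch on a toy two-leaf tree: each leaf targets the posterior of a single Gaussian variable given one observation, and the root merges the two child populations by resampling each, pairing particles, and reweighting by a pairwise coupling factor. The toy model, particle count, multinomial resampling, and all function names (leaf_target, merge, and so on) are assumptions chosen for illustration only; this is not the authors' implementation, which covers general trees, adaptive strategies, and marginal-likelihood estimation.

import numpy as np

rng = np.random.default_rng(0)
N = 2000  # particles per population


def leaf_target(y, sigma=1.0):
    # Leaf node: target proportional to N(x; 0, 1) * N(y; x, sigma^2).
    # Draw particles from the N(0, 1) prior and weight by the likelihood.
    x = rng.normal(0.0, 1.0, size=N)
    log_w = -0.5 * ((y - x) / sigma) ** 2
    return x, log_w


def normalise(log_w):
    # Self-normalised weights from log-weights.
    w = np.exp(log_w - log_w.max())
    return w / w.sum()


def resample(x, log_w):
    # Multinomial resampling of one weighted particle population.
    idx = rng.choice(N, size=N, p=normalise(log_w))
    return x[idx]


def merge(pop_left, pop_right, log_coupling):
    # Divide-and-conquer merge step: resample each child population,
    # pair the particles, and reweight by the factor that couples
    # the two subtrees in the target distribution.
    xl = resample(*pop_left)
    xr = resample(*pop_right)
    return (xl, xr), log_coupling(xl, xr)


# Toy tree: two leaves with observations y1, y2, joined at the root by an
# MRF-style edge potential exp(-0.5 * (x1 - x2)^2).
pop1 = leaf_target(y=1.0)
pop2 = leaf_target(y=-0.5)
(x1, x2), log_w = merge(pop1, pop2, lambda a, b: -0.5 * (a - b) ** 2)

w = normalise(log_w)
print("estimated E[x1 | y1, y2]:", np.sum(w * x1))
print("estimated E[x2 | y1, y2]:", np.sum(w * x2))

In the full method this merge step is applied recursively up the auxiliary tree, with each node's weighted population approximating the intermediate target associated with its subtree.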
Pages: 445-458
Number of pages: 14