Analysis of a Class of Multilevel Markov Chain Monte Carlo Algorithms Based on Independent Metropolis-Hastings

Cited: 4
Authors
Madrigal-Cianci, Juan P. [1]
Nobile, Fabio [2]
Tempone, Raul [3,4]
Affiliations
[1] Protocol Labs, San Francisco, CA 94104 USA
[2] Ecole Polytech Fed Lausanne, SB MATH CSQI, CH-1015 Lausanne, Switzerland
[3] KAUST, Comp Elect & Math Sci & Engn, Thuwal, Saudi Arabia
[4] Rhein Westfal TH Aachen, Math Uncertainty Quantificat, D-52062 Aachen, Germany
Source
SIAM/ASA JOURNAL ON UNCERTAINTY QUANTIFICATION | 2023, Vol. 11, No. 1
Keywords
Bayesian inversion; multilevel Monte Carlo; Markov chain Monte Carlo; uncertainty quantification; explicit error bounds; MCMC
DOI
10.1137/21M1420927
Chinese Library Classification (CLC)
O1 [Mathematics]
Discipline classification code
0701; 070101
Abstract
In this work, we present, analyze, and implement a class of multilevel Markov chain Monte Carlo (ML-MCMC) algorithms based on independent Metropolis-Hastings proposals for Bayesian inverse problems. In this context, the likelihood function involves solving a complex differential model, which is then approximated on a sequence of increasingly accurate discretizations. The key idea of the algorithm is to construct highly coupled Markov chains across levels and combine them with the standard multilevel Monte Carlo argument, yielding a better cost-tolerance complexity than a single-level MCMC algorithm. Our method extends the ideas of Dodwell et al. [SIAM/ASA J. Uncertain. Quantif., 3 (2015), pp. 1075-1108] to a wider range of proposal distributions. We present a thorough convergence analysis of the proposed ML-MCMC method and show, in particular, that (i) under some mild conditions on the (independent) proposals and the family of posteriors, there exists a unique invariant probability measure for the coupled chains generated by our method, and (ii) such coupled chains are uniformly ergodic. We also generalize the cost-tolerance theorem of Dodwell et al. to our wider class of ML-MCMC algorithms. Finally, we propose a self-tuning continuation-type ML-MCMC algorithm. The presented method is tested on an array of academic examples, where some of our theoretical results are numerically verified. These numerical experiments show that our extended ML-MCMC method is robust when targeting some pathological posteriors for which some of the previously proposed ML-MCMC algorithms fail.
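To make the coupling idea in the abstract concrete, the sketch below shows one way to pair, at each level, a fine-level and a coarse-level chain driven by the same independent Metropolis-Hastings proposal and the same acceptance uniform, and then assemble the multilevel estimate as a telescoping sum of per-level corrections. The toy forward model G, the standard normal prior and proposal, the noise level, and the quantity of interest Q are illustrative assumptions for this sketch, not the paper's setting or the authors' exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy Bayesian inverse problem (all modeling choices here are illustrative) ---
def G(theta, level):
    # Hypothetical level-dependent forward model: coarser levels carry a bias 2^{-level}.
    return theta + 2.0 ** (-level)

y_obs, sigma = 1.0, 0.5                      # synthetic datum and observation noise

def log_post(theta, level):
    log_prior = -0.5 * theta ** 2            # standard normal prior
    log_lik = -0.5 * ((y_obs - G(theta, level)) / sigma) ** 2
    return log_prior + log_lik

def log_q(theta):
    # Independent Metropolis-Hastings proposal q = N(0, 1); its density enters the ratio.
    return -0.5 * theta ** 2

def Q(theta):
    return theta ** 2                        # quantity of interest

def coupled_imh(level, n_samples, burn_in=1_000):
    """One ML-MCMC level: chains targeting pi_level and pi_{level-1}, coupled by
    sharing the independent proposal and the acceptance uniform at every step."""
    theta_f = theta_c = 0.0
    diffs = []
    for n in range(n_samples + burn_in):
        prop = rng.standard_normal()         # common independent proposal
        u = rng.uniform()                    # common acceptance uniform
        # Fine-level accept/reject with the independent MH ratio.
        log_a_f = (log_post(prop, level) - log_post(theta_f, level)
                   + log_q(theta_f) - log_q(prop))
        if np.log(u) < log_a_f:
            theta_f = prop
        if level == 0:
            diff = Q(theta_f)                # level 0 estimates E[Q_0] itself
        else:
            # Coarse-level accept/reject with the same proposal and uniform.
            log_a_c = (log_post(prop, level - 1) - log_post(theta_c, level - 1)
                       + log_q(theta_c) - log_q(prop))
            if np.log(u) < log_a_c:
                theta_c = prop
            diff = Q(theta_f) - Q(theta_c)   # correction term E[Q_l - Q_{l-1}]
        if n >= burn_in:
            diffs.append(diff)
    return np.mean(diffs)

# Multilevel estimator: telescoping sum of the per-level contributions.
L = 4
ml_estimate = sum(coupled_imh(l, n_samples=20_000) for l in range(L + 1))
print(f"ML-MCMC estimate of E[Q]: {ml_estimate:.3f}")
```

Because both chains in a pair see the same proposal and the same uniform, they tend to accept or reject together once the level-l and level-(l-1) posteriors are close, which is what keeps the variance of the per-level correction small and drives the improved cost-tolerance complexity.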
Pages: 91-138
Number of pages: 48
Related papers
50 related records in total
  • [1] Improved localisation algorithm based on Markov chain Monte Carlo-Metropolis Hastings for wireless sensor networks
    Zhou, Yucai
    Charles, Munyabugingo
    Wang, Tong
    Song, Min
    INTERNATIONAL JOURNAL OF SENSOR NETWORKS, 2020, 33 (03) : 159 - 167
  • [2] Multilevel Markov Chain Monte Carlo
    Dodwell, T. J.
    Ketelsen, C.
    Scheichl, R.
    Teckentrup, A. L.
    SIAM REVIEW, 2019, 61 (03) : 509 - 545
  • [3] MARKOV CHAIN SIMULATION FOR MULTILEVEL MONTE CARLO
    Jasra, Ajay
    Law, Kody J. H.
    Xu, Yaxian
    FOUNDATIONS OF DATA SCIENCE, 2021, 3 (01): : 27 - 47
  • [4] On adaptive Markov chain Monte Carlo algorithms
    Atchadé, Y. F.
    Rosenthal, J. S.
    BERNOULLI, 2005, 11 (05) : 815 - 828
  • [5] Proximal Markov chain Monte Carlo algorithms
    Pereyra, Marcelo
    STATISTICS AND COMPUTING, 2016, 26 (04) : 745 - 760
  • [6] Bayesian forecasting and uncertainty quantifying of stream flows using Metropolis-Hastings Markov Chain Monte Carlo algorithm
    Wang, Hongrui
    Wang, Cheng
    Wang, Ying
    Gao, Xiong
    Yu, Chen
    JOURNAL OF HYDROLOGY, 2017, 549 : 476 - 483
  • [7] Exact convergence analysis of the independent Metropolis-Hastings algorithms
    Wang, Guanyang
    BERNOULLI, 2022, 28 (03) : 2012 - 2033
  • [8] The Convergence of Markov Chain Monte Carlo Methods: From the Metropolis Method to Hamiltonian Monte Carlo
    Betancourt, Michael
    ANNALEN DER PHYSIK, 2019, 531 (03)
  • [9] Sequential Monte Carlo Samplers with Independent Markov Chain Monte Carlo Proposals
    South, L. F.
    Pettitt, A. N.
    Drovandi, C. C.
    BAYESIAN ANALYSIS, 2019, 14 (03): : 753 - 776