Accelerating Proximal Markov Chain Monte Carlo by Using an Explicit Stabilized Method

Cited by: 23
Authors
Pereyra, Marcelo [1,2]
Vargas Mieles, Luis [1,2,3]
Zygalakis, Konstantinos C. [2,3]
Affiliations
[1] Heriot-Watt University, School of Mathematical and Computer Sciences, Edinburgh EH14 4AS, Scotland
[2] Maxwell Institute for Mathematical Sciences, Bayes Centre, Edinburgh EH8 9BT, Scotland
[3] University of Edinburgh, School of Mathematics, Edinburgh EH9 3FD, Scotland
Keywords
mathematical imaging; inverse problems; Bayesian inference; Markov chain Monte Carlo methods; proximal algorithms; MEAN-SQUARE; LANGEVIN; REGULARIZATION; OPTIMIZATION; ALGORITHM; NOISE; STIFF;
DOI
10.1137/19M1283719
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We present a highly efficient proximal Markov chain Monte Carlo methodology to perform Bayesian computation in imaging problems. Similarly to previous proximal Monte Carlo approaches, the proposed method is derived from an approximation of the Langevin diffusion. However, instead of the conventional Euler-Maruyama approximation that underpins existing proximal Monte Carlo methods, here we use a state-of-the-art orthogonal Runge-Kutta-Chebyshev stochastic approximation [A. Abdulle, I. Almuslimani, and G. Vilmart, SIAM/ASA J. Uncertain. Quantif., 6 (2018), pp. 937-964] that combines several gradient evaluations to significantly accelerate its convergence speed, similarly to accelerated gradient optimization methods. The proposed methodology is demonstrated via a range of numerical experiments, including non-blind image deconvolution, hyperspectral unmixing, and tomographic reconstruction, with total-variation and ℓ1-type priors. Comparisons with Euler-type proximal Monte Carlo methods confirm that the Markov chains generated with our method exhibit significantly faster convergence speeds, achieve larger effective sample sizes, and produce lower mean-square estimation errors for the same computational budget.
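
To make the contrast in the abstract concrete, the following is a minimal, hedged Python sketch of the two Langevin discretisations being compared: a conventional Euler-Maruyama proximal step of MYULA type, and an explicit stabilized step of SK-ROCK type that spends s gradient evaluations per iteration. It assumes a target density pi(x) proportional to exp(-f(x) - g(x)), with f smooth and g non-smooth but proximable, handled through Moreau-Yosida smoothing with parameter lam. The function names (grad_f, prox_g) and the SK-ROCK stage constants are illustrative reconstructions from Abdulle, Almuslimani, and Vilmart (2018), not the authors' implementation, and should be checked against that reference before use.

import numpy as np
from scipy.special import eval_chebyt, eval_chebyu

def grad_log_pi_lambda(x, grad_f, prox_g, lam):
    # Gradient of the Moreau-Yosida smoothed log-target:
    # -grad f(x) - (x - prox_{lam*g}(x)) / lam
    return -grad_f(x) - (x - prox_g(x, lam)) / lam

def myula_step(x, delta, grad_f, prox_g, lam, rng):
    # Conventional Euler-Maruyama (MYULA-type) update: one gradient call per step.
    xi = rng.standard_normal(x.shape)
    return (x + delta * grad_log_pi_lambda(x, grad_f, prox_g, lam)
            + np.sqrt(2.0 * delta) * xi)

def skrock_step(x, delta, s, eta, grad_f, prox_g, lam, rng):
    # Explicit stabilized (SK-ROCK-type) update: s gradient calls combined through
    # a Runge-Kutta-Chebyshev recursion, which tolerates much larger step sizes.
    grad = lambda z: grad_log_pi_lambda(z, grad_f, prox_g, lam)
    w0 = 1.0 + eta / s**2
    w1 = eval_chebyt(s, w0) / (s * eval_chebyu(s - 1, w0))  # T_s(w0) / T_s'(w0)
    xi = np.sqrt(2.0 * delta) * rng.standard_normal(x.shape)
    k_prev = x
    k_curr = x + (w1 / w0) * delta * grad(x + (s * w1 / 2.0) * xi) + (s * w1 / w0) * xi
    for j in range(2, s + 1):
        t_j = eval_chebyt(j, w0)
        mu = 2.0 * w1 * eval_chebyt(j - 1, w0) / t_j
        nu = 2.0 * w0 * eval_chebyt(j - 1, w0) / t_j
        kappa = 1.0 - nu  # equals -T_{j-2}(w0)/T_j(w0) by the Chebyshev recursion
        k_prev, k_curr = k_curr, mu * delta * grad(k_curr) + nu * k_curr + kappa * k_prev
    return k_curr

The practical point is that the Chebyshev recursion extends the stability region of the discretisation roughly quadratically in s, so each skrock_step can take a step size on the order of s^2 times larger than a stable myula_step while costing only s gradient evaluations, which is where the acceleration reported in the abstract comes from.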
Pages: 905-935
Number of pages: 31
References
52 in total
  • [1] Abdulle, A. (2015). Encyclopedia of Applied and Computational Mathematics, p. 460. DOI: 10.1007/978-3-540-70529-1_100.
  • [2] Abdulle, A., Almuslimani, I., and Vilmart, G. (2018). Optimal explicit stabilized integrator of weak order 1 for stiff and ergodic stochastic differential equations. SIAM/ASA Journal on Uncertainty Quantification, 6(2), 937-964.
  • [3] Abdulle, A., Vilmart, G., and Zygalakis, K. C. (2014). High order numerical approximation of the invariant measure of ergodic SDEs. SIAM Journal on Numerical Analysis, 52(4), 1600-1622.
  • [4] Abdulle, A., and Cirilli, S. (2008). S-ROCK: Chebyshev methods for stiff stochastic differential equations. SIAM Journal on Scientific Computing, 30(2), 997-1014.
  • [5] Afonso, M. V., Bioucas-Dias, J. M., and Figueiredo, M. A. T. (2010). Fast image recovery using variable splitting and constrained optimization. IEEE Transactions on Image Processing, 19(9), 2345-2356.
  • [6] [Anonymous] (2007). BAYESIAN CHOICE DECI.
  • [7] [Anonymous]. arXiv:1805.07199.
  • [8] [Anonymous] (2019). Journal of Machine Learning Research.
  • [9] [Anonymous] (1992). Statistical Science.
  • [10] Arridge, S., Maass, P., Öktem, O., and Schönlieb, C.-B. (2019). Solving inverse problems using data-driven models. Acta Numerica, 28, 1-174.