Plug-and-Play Split Gibbs Sampler: Embedding Deep Generative Priors in Bayesian Inference

Cited by: 0
Authors
Coeurdoux, Florentin [1 ]
Dobigeon, Nicolas [1 ]
Chainais, Pierre [2 ]
Affiliations
[1] Univ Toulouse, CNRS, IRIT, INP ENSEEIHT, F-31071 Toulouse, France
[2] Univ Lille, CNRS, Cent Lille, UMR 9189 CRIStAL (Ctr Rech Informat Signal & Automat Lille), F-59000 Lille, France
Keywords
Noise reduction; Stochastic processes; Inverse problems; Data models; Bayes methods; Task analysis; Kernel; Bayesian inference; plug-and-play prior; deep generative model; diffusion-based model; Markov chain Monte Carlo; inverse problem
DOI
10.1109/TIP.2024.3404338
CLC number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
This paper introduces a stochastic plug-and-play (PnP) sampling algorithm that leverages variable splitting to efficiently sample from a posterior distribution. The algorithm, based on split Gibbs sampling (SGS), draws inspiration from the half-quadratic splitting (HQS) method and the alternating direction method of multipliers (ADMM). It divides the challenging task of posterior sampling into two simpler sampling problems. The first depends on the likelihood function, while the second can be interpreted as a Bayesian denoising problem that is readily carried out by a deep generative model. Specifically, for illustrative purposes, the proposed method is implemented in this paper using state-of-the-art diffusion-based generative models. Like its deterministic PnP counterparts, the proposed method has the great advantage of not requiring an explicit choice of the prior distribution, which is instead encoded in a pretrained generative model. However, unlike optimization methods (e.g., PnP-ADMM and PnP-HQS), which generally provide only point estimates, the proposed approach allows conventional Bayesian estimators to be accompanied by confidence intervals at a reasonable additional computational cost. Experiments on commonly studied image processing problems illustrate the efficiency of the proposed sampling strategy, whose performance is compared to recent state-of-the-art optimization and sampling methods.
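The abstract describes alternating between a likelihood-driven sampling step and a Bayesian denoising step. The following is a minimal, self-contained sketch of that split Gibbs structure on a toy problem; it is not the paper's implementation. The forward model (identity operator plus Gaussian noise), the closed-form Gaussian prior standing in for the deep generative denoiser, and all parameter values (`tau`, `sigma`, `rho`, iteration counts) are assumptions chosen so that both conditional draws are exact Gaussians.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy inverse problem: y = x_true + noise (identity forward operator),
# with a Gaussian prior N(0, tau^2 I) standing in for the deep generative prior.
n = 50
tau = 1.0       # prior std (assumption for this toy prior)
sigma = 0.5     # observation noise std
x_true = rng.normal(0.0, tau, n)
y = x_true + sigma * rng.normal(size=n)

rho = 0.3       # splitting parameter coupling the two variables x and z

def sample_likelihood(z):
    """Step 1: draw x from p(y|x) * N(x; z, rho^2 I).
    With a Gaussian likelihood this conditional is Gaussian in closed form."""
    var = 1.0 / (1.0 / sigma**2 + 1.0 / rho**2)
    mean = var * (y / sigma**2 + z / rho**2)
    return mean + np.sqrt(var) * rng.normal(size=n)

def sample_denoiser(x):
    """Step 2: draw z from p(z) * N(z; x, rho^2 I) -- a Bayesian denoising step.
    Here p(z) = N(0, tau^2 I); in the paper this step is instead carried out
    by a pretrained diffusion-based generative model."""
    var = 1.0 / (1.0 / tau**2 + 1.0 / rho**2)
    mean = var * (x / rho**2)
    return mean + np.sqrt(var) * rng.normal(size=n)

# Split Gibbs loop: alternate the two simpler sampling problems.
z = np.zeros(n)
samples = []
for it in range(2000):
    x = sample_likelihood(z)
    z = sample_denoiser(x)
    if it >= 500:            # discard burn-in
        samples.append(x)
samples = np.asarray(samples)

# Posterior mean estimate and pointwise 95% credible intervals from the chain,
# illustrating the uncertainty quantification that point estimators lack.
x_mmse = samples.mean(axis=0)
lo, hi = np.percentile(samples, [2.5, 97.5], axis=0)
```

As the splitting parameter `rho` shrinks, the marginal of `x` under this augmented scheme approaches the true posterior, at the price of slower mixing; that trade-off is the usual tuning knob in SGS-type samplers.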
Pages: 3496-3507 (12 pages)