Scaling Up Bayesian Uncertainty Quantification for Inverse Problems Using Deep Neural Networks

Cited: 9
Authors
Lan, Shiwei [1 ]
Li, Shuyi [1 ]
Shahbaba, Babak [2 ]
Affiliations
[1] Arizona State Univ, Sch Math & Stat Sci, Tempe, AZ 85287 USA
[2] Univ Calif Irvine, Dept Stat, Irvine, CA 92697 USA
Keywords
Bayesian inverse problems; ensemble Kalman methods; emulation; convolutional neural network; dimension reduction; autoencoder
Keywords Plus (database-generated): ENSEMBLE KALMAN FILTER; MONTE-CARLO METHODS; MCMC METHODS; MODEL; ASSIMILATION; RECOGNITION; CALIBRATION; GAME; GO
DOI: 10.1137/21M1439456
Chinese Library Classification: O1 [Mathematics]
Subject Classification Codes: 0701; 070101
Abstract
Because uncertainty quantification (UQ) is central to many applications, the Bayesian approach to inverse problems has recently gained popularity in applied mathematics, physics, and engineering. However, traditional Bayesian inference methods based on Markov chain Monte Carlo (MCMC) tend to be computationally intensive and inefficient for high-dimensional problems of this kind. To address this issue, several methods based on surrogate models have been proposed to speed up the inference process. In particular, the calibration-emulation-sampling (CES) scheme has proven successful in high-dimensional UQ problems. In this work, we propose a novel CES approach to Bayesian inference that uses deep neural network models for the emulation phase. The resulting algorithm is computationally more efficient and more robust to variations in the training set. Further, by using an autoencoder (AE) for dimension reduction, we speed up our Bayesian inference method by up to three orders of magnitude. Overall, our method, henceforth called the dimension-reduced emulative autoencoder Monte Carlo (DREAMC) algorithm, scales Bayesian UQ up to thousands of dimensions for inverse problems. We first illustrate the validity of this approach on two low-dimensional (linear and nonlinear) inverse problems. We then apply our method to two high-dimensional numerical examples (elliptic and advection-diffusion) to demonstrate its computational advantages over existing algorithms.
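The three CES stages the abstract outlines — calibration, emulation, sampling — can be sketched end to end on a toy problem. The following is a minimal illustration, not the authors' implementation: it uses a small linear forward map in place of the paper's PDE models, ensemble Kalman inversion for calibration, a plain linear least-squares surrogate standing in for the paper's DNN/CNN emulator (the autoencoder dimension-reduction step is omitted), and random-walk Metropolis for sampling. All variable names and problem sizes here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear inverse problem: y = G(u) + noise, u in R^4, y in R^6
d, m = 4, 6
A = rng.standard_normal((m, d))
def forward(u):                          # the (expensive, in practice) forward map G
    return A @ u
u_true = rng.standard_normal(d)
sigma = 0.1
y = forward(u_true) + sigma * rng.standard_normal(m)

# --- Calibration: a few ensemble Kalman inversion (EKI) steps ---
J = 50                                   # ensemble size
U = rng.standard_normal((J, d))          # prior ensemble u ~ N(0, I)
pairs = []                               # collected (u, G(u)) training pairs
for _ in range(5):
    Gu = np.array([forward(u) for u in U])
    pairs += list(zip(U, Gu))
    ub, gb = U.mean(0), Gu.mean(0)
    Cug = (U - ub).T @ (Gu - gb) / J     # parameter-output cross-covariance
    Cgg = (Gu - gb).T @ (Gu - gb) / J    # output covariance
    K = Cug @ np.linalg.inv(Cgg + sigma**2 * np.eye(m))   # Kalman gain
    Y = y + sigma * rng.standard_normal((J, m))           # perturbed observations
    U = U + (Y - Gu) @ K.T

# --- Emulation: fit a cheap surrogate to G from the calibration pairs ---
# (linear least squares here; the paper trains DNN/CNN emulators instead)
X = np.array([p[0] for p in pairs])
Z = np.array([p[1] for p in pairs])
W, *_ = np.linalg.lstsq(X, Z, rcond=None)
emulate = lambda u: u @ W                # surrogate never calls forward()

# --- Sampling: random-walk Metropolis on the emulated log-posterior ---
def logpost(u):
    r = y - emulate(u)
    return -0.5 * (r @ r) / sigma**2 - 0.5 * (u @ u)   # Gaussian likelihood + prior

u = U.mean(0)                            # start from the calibrated ensemble mean
lp = logpost(u)
samples = []
for _ in range(2000):
    prop = u + 0.1 * rng.standard_normal(d)
    lpp = logpost(prop)
    if np.log(rng.uniform()) < lpp - lp:
        u, lp = prop, lpp
    samples.append(u)
post_mean = np.mean(samples[500:], axis=0)
```

The design point the sketch captures is that the expensive forward map is only evaluated during calibration; the MCMC stage runs entirely on the surrogate, which is where the speedup reported in the abstract comes from.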
Pages: 1684-1713 (30 pages)