Sparse-grid, reduced-basis Bayesian inversion

Times cited: 34
Authors
Chen, Peng [1]
Schwab, Christoph [1]
Affiliations
[1] ETH Zurich, Seminar for Applied Mathematics, CH-8092 Zurich, Switzerland
Funding
European Research Council
Keywords
Bayesian inversion; Reduced basis; Sparse grid; Error estimates; Best N-term convergence; Curse of dimensionality; PARTIAL-DIFFERENTIAL-EQUATIONS; POSTERIORI ERROR ESTIMATION; MODEL-REDUCTION; POLYNOMIAL-APPROXIMATION; ANALYTIC REGULARITY; GREEDY ALGORITHMS; PARAMETER;
DOI
10.1016/j.cma.2015.08.006
Chinese Library Classification (CLC)
T [Industrial technology]
Subject classification code
08
Abstract
We analyze reduced basis (RB) acceleration of recently proposed sparse Bayesian inversion algorithms for partial differential equations with uncertain distributed parameters and observation data subject to additive Gaussian noise. Specifically, we consider Bayesian inversion of affine-parametric, linear operator families on possibly high-dimensional parameter spaces. We consider "high-fidelity" Petrov-Galerkin (PG) discretizations of these countably-parametric operator families: we allow general families of inf-sup stable PG Finite-Element methods, covering most conforming primal and mixed Finite-Element discretizations of standard problems in mechanics. RB acceleration of the high-dimensional, parametric forward response maps, which must be solved numerically many times during Bayesian inversion, is proposed, and convergence rate bounds for the error in the Bayesian estimate incurred by the use of RB are derived. As a consequence of recent theoretical results on dimension-independent sparsity of parametric responses, and on preservation of sparsity for holomorphic-parametric problems, we establish new convergence rates of greedy RB compressions both for the parametric forward maps and for the countably-parametric posterior densities arising in Bayesian inversion. We show that the convergence rates for the RB compressions of the parametric forward maps, as well as of the countably-parametric, sparse Bayesian posterior densities, are free from the curse of dimensionality and depend only on the sparsity of the uncertain input data. In particular, we establish that the RB compression error for the posterior densities is quadratic in that for the parametric forward maps. Numerical experiments for linear elliptic, affine-parametric model problems in two space dimensions with hundreds of parameters are reported; they confirm that the proposed adaptive sparse-grid reduced-basis algorithms indeed exploit sparsity of both the parametric forward maps and the Bayesian posterior density. (C) 2015 Elsevier B.V. All rights reserved.
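As a concrete illustration of the workflow described in the abstract, the sketch below sets up a small affine-parametric diffusion problem, builds a greedy RB surrogate of the parametric forward map, and evaluates the unnormalized Bayesian posterior density with both the high-fidelity and the reduced solver. This is a minimal sketch, not the authors' implementation: it assumes a 1D finite-difference discretization with J = 8 parameters (the paper treats PG discretizations with hundreds of parameters), a strong greedy using true projection errors instead of the weak greedy driven by a-posteriori estimators, and pointwise density evaluations instead of the adaptive sparse-grid quadrature analyzed in the paper. All names (stiffness, solve_hifi, greedy_rb, posterior_density) are illustrative.

```python
# Minimal sketch (not the authors' code): RB acceleration of Bayesian inversion
# for an affine-parametric elliptic problem with a uniform prior on [-1,1]^J
# and additive Gaussian observation noise. Toy 1D setting, hypothetical choices.
import numpy as np

n, J = 200, 8                              # interior grid points, parameters
x = np.linspace(0.0, 1.0, n + 2)[1:-1]     # interior nodes
h = 1.0 / (n + 1)

def stiffness(a_nodes):
    """Tridiagonal finite-difference matrix for -(a u')' with zero Dirichlet BCs."""
    a_mid = 0.5 * (np.r_[a_nodes[0], a_nodes] + np.r_[a_nodes, a_nodes[-1]])
    main = (a_mid[:-1] + a_mid[1:]) / h**2
    off = -a_mid[1:-1] / h**2
    return np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

# Affine-parametric operator family A(y) = A0 + sum_j y_j * Aj, y in [-1,1]^J,
# with coefficient a(x, y) = 1 + sum_j y_j * 0.5 (j+1)^-2 cos((j+1) pi x) > 0.
A0 = stiffness(np.ones(n))
Aj = [stiffness(0.5 / (j + 1) ** 2 * np.cos((j + 1) * np.pi * x)) for j in range(J)]
f = np.ones(n)

# Three point observations and their noise level (hypothetical choices).
O = np.zeros((3, n))
O[0, n // 4] = O[1, n // 2] = O[2, 3 * n // 4] = 1.0
gamma = 1e-2

def solve_hifi(y):
    """High-fidelity forward solve for one parameter vector y."""
    A = A0 + sum(yj * Ajj for yj, Ajj in zip(y, Aj))
    return np.linalg.solve(A, f)

def greedy_rb(train, tol=1e-8, max_dim=20):
    """Strong greedy RB construction using true projection errors over a training set."""
    V = np.empty((n, 0))
    for _ in range(max_dim):
        errs = [np.linalg.norm(solve_hifi(y) - V @ (V.T @ solve_hifi(y))) for y in train]
        k = int(np.argmax(errs))
        if errs[k] < tol:
            break
        u_new = solve_hifi(train[k])
        u_new -= V @ (V.T @ u_new)          # orthogonalize against current basis
        V = np.c_[V, u_new / np.linalg.norm(u_new)]
    return V

rng = np.random.default_rng(0)
train = rng.uniform(-1.0, 1.0, size=(100, J))
V = greedy_rb(train)

# RB forward solve: Galerkin projection of the affine decomposition, assembled once.
A0_r = V.T @ A0 @ V
Aj_r = [V.T @ A @ V for A in Aj]
f_r = V.T @ f

def solve_rb(y):
    A_r = A0_r + sum(yj * Ar for yj, Ar in zip(y, Aj_r))
    return V @ np.linalg.solve(A_r, f_r)

# Synthetic data: observe the solution at y_true, perturbed by Gaussian noise.
y_true = rng.uniform(-1.0, 1.0, J)
delta = O @ solve_hifi(y_true) + np.sqrt(gamma) * rng.standard_normal(3)

def posterior_density(y, solver):
    """Unnormalized Bayesian posterior density w.r.t. the uniform prior on [-1,1]^J."""
    r = delta - O @ solver(y)
    return np.exp(-0.5 * (r @ r) / gamma)

y_test = rng.uniform(-1.0, 1.0, J)
print("RB dimension:", V.shape[1])
print("posterior at y_true (hi-fi / RB):",
      posterior_density(y_true, solve_hifi), posterior_density(y_true, solve_rb))
print("posterior at y_test (hi-fi / RB):",
      posterior_density(y_test, solve_hifi), posterior_density(y_test, solve_rb))
```

Because the operator is affine in y, the reduced matrices A0_r and Aj_r are precomputed once, so each RB posterior evaluation only requires assembling and solving a dense system of the (small) RB dimension; this is what makes the many density evaluations required by a sparse-grid quadrature affordable in the setting the abstract describes.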
Pages: 84-115
Number of pages: 32