Bayesian learning of orthogonal embeddings for multi-fidelity Gaussian Processes

Cited: 17
Authors
Tsilifis, Panagiotis [1 ]
Pandita, Piyush [1 ]
Ghosh, Sayan [1 ]
Andreoli, Valeria [2 ]
Vandeputte, Thomas [2 ]
Wang, Liping [1 ]
Affiliations
[1] GE Research, Probabilistic Design & Optimization Group, Niskayuna, NY 12309, USA
[2] GE Research, Aerodynamics & Computational Fluid Dynamics Group, Niskayuna, NY 12309, USA
Keywords
Gaussian Process regression; Multi-fidelity simulations; Dimension reduction; Geodesic Monte Carlo; Bayesian inference; Uncertainty propagation; Polynomial chaos; Dimensionality reduction; Design; Model; Approximations; Optimization; Adaptation; Equations; Support
DOI
10.1016/j.cma.2021.114147
Chinese Library Classification
T [Industrial Technology]
Subject Classification
08
Abstract
Uncertainty propagation in complex engineering systems often poses significant computational challenges related to modeling and quantifying probability distributions of model outputs, as these emerge from the various sources of uncertainty that are inherent in the system under investigation. Gaussian Process (GP) regression is a robust meta-modeling technique that allows for fast model prediction and exploration of response surfaces. Multi-fidelity variants of GPs further leverage information from cheap, low-fidelity model simulations in order to improve their predictive performance on the high-fidelity model. In order to cope with the high volume of data required to train GPs in high-dimensional design spaces, a common practice is to introduce latent design variables, typically projections of the original input space onto a lower-dimensional subspace, thereby substituting the problem of learning the initial high-dimensional mapping with that of training a GP on a low-dimensional space. In this paper, we present a Bayesian approach to identifying optimal transformations that map the input points to low-dimensional latent variables. The "projection" mapping consists of an orthonormal matrix that is considered a priori unknown and needs to be inferred jointly with the GP parameters, conditioned on the available training data. The proposed Bayesian inference scheme relies on a two-step iterative algorithm that samples from the marginal posteriors of the GP parameters and the projection matrix, respectively, both using Markov Chain Monte Carlo (MCMC) sampling. In order to account for the orthogonality constraints imposed on the orthonormal projection matrix, a Geodesic Monte Carlo sampling algorithm is employed, which is suitable for exploring probability measures defined on manifolds. We extend the proposed framework to multi-fidelity GP models, including the scenario of training multiple outputs jointly. We validate our framework on three synthetic problems with a known lower-dimensional subspace. The benefits of the proposed framework are illustrated on the computationally challenging aerodynamic optimization of a last-stage blade for an industrial gas turbine, where we study the effect of an 85-dimensional shape parameterization of a three-dimensional airfoil on two output quantities of interest, specifically the aerodynamic efficiency and the degree of reaction. (C) 2021 Elsevier B.V. All rights reserved.
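The core construction can be sketched as follows: the GP is trained not on the original high-dimensional inputs x but on latent variables z = W^T x, where W has orthonormal columns. The short Python sketch below illustrates this projected-GP idea on a synthetic function with a known two-dimensional active subspace. It is a minimal illustration, not the authors' implementation: the test function, dimensions, and the use of NumPy/scikit-learn are assumptions, and W is fixed here rather than inferred with Geodesic Monte Carlo as in the paper.

# Minimal sketch of GP regression on an orthonormal low-dimensional embedding.
# Illustrative only: the synthetic function, dimensions, and scikit-learn usage
# are assumptions, not the paper's actual implementation or data.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)
D, d, N = 85, 2, 200                       # ambient dim, latent dim, training size

# Hypothetical "true" two-dimensional active subspace of a synthetic response.
W_true, _ = np.linalg.qr(rng.standard_normal((D, d)))

def f(X):
    Z = X @ W_true                         # the response only varies along W_true
    return np.sin(Z[:, 0]) + 0.5 * Z[:, 1] ** 2

X_train = rng.standard_normal((N, D))
y_train = f(X_train)

def fit_projected_gp(W):
    """Train a GP on the latent variables z = W^T x (W has orthonormal columns)."""
    Z = X_train @ W                        # project inputs onto the d-dim subspace
    kernel = ConstantKernel(1.0) * RBF(length_scale=np.ones(d))
    return GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(Z, y_train)

# Compare a random orthonormal projection with the true subspace on held-out data.
X_test = rng.standard_normal((100, D))
y_test = f(X_test)
W_rand, _ = np.linalg.qr(rng.standard_normal((D, d)))
for name, W in [("random orthonormal W", W_rand), ("true subspace W", W_true)]:
    gp = fit_projected_gp(W)
    rmse = np.sqrt(np.mean((gp.predict(X_test @ W) - y_test) ** 2))
    print(f"{name}: held-out RMSE = {rmse:.3f}")

The gap between the two RMSE values shows why the choice of W matters; the paper's contribution is to treat W as an unknown on the Stiefel manifold and sample it jointly with the GP hyperparameters.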
Pages: 23