Bayesian learning of orthogonal embeddings for multi-fidelity Gaussian Processes

Cited by: 17
Authors
Tsilifis, Panagiotis [1 ]
Pandita, Piyush [1 ]
Ghosh, Sayan [1 ]
Andreoli, Valeria [2 ]
Vandeputte, Thomas [2 ]
Wang, Liping [1 ]
Affiliations
[1] General Electric Research, Probabilistic Design & Optimization Group, Niskayuna, NY 12309, USA
[2] General Electric Research, Aerodynamics & Computational Fluid Dynamics Group, Niskayuna, NY 12309, USA
Keywords
Gaussian Process regression; Multi-fidelity simulations; Dimension reduction; Geodesic Monte Carlo; Bayesian inference; Uncertainty propagation; POLYNOMIAL CHAOS; DIMENSIONALITY REDUCTION; UNCERTAINTY PROPAGATION; DESIGN; MODEL; APPROXIMATIONS; OPTIMIZATION; ADAPTATION; EQUATIONS; SUPPORT;
DOI
10.1016/j.cma.2021.114147
Chinese Library Classification
T [Industrial Technology];
Discipline classification code
08 ;
Abstract
Uncertainty propagation in complex engineering systems often poses significant computational challenges related to modeling and quantifying probability distributions of model outputs, as these emerge as the result of the various sources of uncertainty inherent in the system under investigation. Gaussian process (GP) regression is a robust meta-modeling technique that allows for fast model prediction and exploration of response surfaces. Multi-fidelity variants of GPs further leverage information from cheap, low-fidelity model simulations in order to improve their predictive performance on the high-fidelity model. To cope with the high volume of data required to train GPs in high-dimensional design spaces, a common practice is to introduce latent design variables, typically projections of the original input space onto a lower-dimensional subspace, thereby substituting the problem of learning the initial high-dimensional mapping with that of training a GP on a low-dimensional space. In this paper, we present a Bayesian approach for identifying optimal transformations that map the input points to low-dimensional latent variables. The "projection" mapping consists of an orthonormal matrix that is considered a priori unknown and must be inferred jointly with the GP parameters, conditioned on the available training data. The proposed Bayesian inference scheme relies on a two-step iterative algorithm that samples from the marginal posteriors of the GP parameters and the projection matrix, respectively, both using Markov chain Monte Carlo (MCMC) sampling. To account for the orthogonality constraints imposed on the orthonormal projection matrix, a Geodesic Monte Carlo sampling algorithm is employed, which is suitable for exploring probability measures on manifolds. We extend the proposed framework to multi-fidelity GP models, including the scenario of jointly training multiple outputs.
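As a rough illustration of the projection-based GP construction described above (a sketch, not the authors' implementation), the following NumPy snippet fits a GP with a squared-exponential kernel to inputs projected through an orthonormal matrix W with W^T W = I. All names, dimensions, and the synthetic test function are hypothetical; in the paper W is inferred by MCMC rather than fixed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: D-dimensional inputs whose response depends only on a
# d-dimensional linear projection z = W^T x, with W orthonormal (W^T W = I_d).
D, d, n = 20, 2, 60
W_true, _ = np.linalg.qr(rng.standard_normal((D, d)))  # orthonormal columns
X = rng.standard_normal((n, D))
y = np.sin(X @ W_true[:, 0]) + 0.5 * (X @ W_true[:, 1]) ** 2
y += 0.01 * rng.standard_normal(n)                      # observation noise

def rbf(A, B, ell=1.0):
    """Squared-exponential kernel evaluated on the projected inputs."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def gp_predict(W, X, y, X_new, ell=1.0, noise=1e-2):
    """Closed-form GP posterior mean, with inputs first projected by W."""
    Z, Z_new = X @ W, X_new @ W
    K = rbf(Z, Z, ell) + noise * np.eye(len(Z))
    alpha = np.linalg.solve(K, y)
    return rbf(Z_new, Z, ell) @ alpha

X_test = rng.standard_normal((10, D))
pred = gp_predict(W_true, X, y, X_test)
```

Because the kernel only sees the d-dimensional coordinates z = W^T x, the GP is trained on the low-dimensional latent space while the original inputs remain D-dimensional, which is the substitution the abstract describes.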
We validate our framework on three synthetic problems with a known lower-dimensional subspace. The benefits of the proposed framework are illustrated on the computationally challenging aerodynamic optimization of a last-stage blade for an industrial gas turbine, where we study the effect of an 85-dimensional shape parameterization of a three-dimensional airfoil on two output quantities of interest, namely the aerodynamic efficiency and the degree of reaction. (C) 2021 Elsevier B.V. All rights reserved.
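The Geodesic Monte Carlo sampler mentioned in the abstract moves along geodesics of the manifold of orthonormal matrices so that proposals satisfy the constraint exactly. As a minimal, self-contained illustration, the sketch below takes the single-column special case, where the manifold reduces to the unit sphere and the geodesic has a closed trigonometric form; the general multi-column (Stiefel) case uses matrix-exponential geodesics and is not shown. All function names here are hypothetical.

```python
import numpy as np

def sphere_geodesic(x, v, t):
    """Geodesic on the unit sphere starting at x with tangent velocity v.

    Returns the position and transported velocity at time t.
    Assumes ||x|| = 1 and x . v = 0 (v lies in the tangent space at x).
    """
    speed = np.linalg.norm(v)
    if speed < 1e-15:
        return x, v
    u = v / speed
    x_t = x * np.cos(speed * t) + u * np.sin(speed * t)
    v_t = speed * (-x * np.sin(speed * t) + u * np.cos(speed * t))
    return x_t, v_t

def project_tangent(x, v):
    """Project an ambient vector onto the tangent space of the sphere at x."""
    return v - (x @ v) * x

rng = np.random.default_rng(1)
x = rng.standard_normal(5)
x /= np.linalg.norm(x)                       # start on the sphere
v = project_tangent(x, rng.standard_normal(5))  # random tangent velocity
x_t, v_t = sphere_geodesic(x, v, 0.3)        # one geodesic step
```

The key property is that the update never leaves the manifold: the new point has unit norm and the transported velocity remains tangent, so a sampler built on these steps respects the orthogonality constraint by construction.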
Pages: 23