Conditional variational autoencoder with Gaussian process regression recognition for parametric models

Cited: 8
Authors
Zhang, Xuehan [1]
Jiang, Lijian [1]
Affiliations
[1] Tongji Univ, Sch Math Sci, Shanghai 200092, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Parametric models; Conditional variational autoencoder; Proper orthogonal decomposition; Gaussian process regression; PROPER ORTHOGONAL DECOMPOSITION; REDUCTION; NETWORKS;
DOI
10.1016/j.cam.2023.115532
Chinese Library Classification
O29 [Applied Mathematics];
Discipline Code
070104;
Abstract
In this article, we present a data-driven method for parametric models with noisy observation data. Gaussian process regression based reduced-order modeling (GPR-based ROM) enables fast online predictions without using equations in the offline stage. However, GPR-based ROM does not perform well for complex systems because the POD projection is inherently linear. A conditional variational autoencoder (CVAE) can address this issue via nonlinear neural networks, but its greater model complexity makes training and hyperparameter tuning challenging. To this end, we propose a framework of CVAE with Gaussian process regression recognition (CVAE-GPRR). The proposed method consists of a recognition model and a likelihood model. In the recognition model, we first extract low-dimensional features from the data by POD to filter out redundant high-frequency information. Then a non-parametric model, GPR, is used to learn the map from parameters to POD latent variables, which also alleviates the impact of noise. CVAE-GPRR can achieve accuracy similar to that of CVAE but with fewer parameters. In the likelihood model, neural networks are used to reconstruct the data. Besides samples of the POD latent variables and the input parameters, physical variables are also added as inputs, so predictions can be made over the whole physical space; this cannot be achieved by either GPR-based ROM or CVAE. Moreover, the numerical results show that CVAE-GPRR may alleviate the overfitting issue in CVAE. (c) 2023 Published by Elsevier B.V.
Pages: 22