B-PINNs: Bayesian physics-informed neural networks for forward and inverse PDE problems with noisy data

Cited by: 603
Authors
Yang, Liu [1 ]
Meng, Xuhui [1 ]
Karniadakis, George Em [1 ,2 ]
Affiliations
[1] Brown Univ, Div Appl Math, Providence, RI 02912 USA
[2] Pacific Northwest Natl Lab, Richland, WA 99354 USA
Keywords
Nonlinear PDEs; Noisy data; Bayesian physics-informed neural networks; Hamiltonian Monte Carlo; Variational inference;
DOI
10.1016/j.jcp.2020.109913
Chinese Library Classification (CLC)
TP39 [Computer Applications];
Discipline Classification Code
081203; 0835;
Abstract
We propose a Bayesian physics-informed neural network (B-PINN) to solve both forward and inverse nonlinear problems described by partial differential equations (PDEs) and noisy data. In this Bayesian framework, a Bayesian neural network (BNN) combined with a PINN for the PDE serves as the prior, while Hamiltonian Monte Carlo (HMC) or variational inference (VI) serves as an estimator of the posterior. B-PINNs exploit both the physical laws and scattered noisy measurements to provide predictions and to quantify the aleatoric uncertainty arising from the noisy data. Compared with PINNs, in addition to uncertainty quantification, B-PINNs yield more accurate predictions in scenarios with large noise because they avoid overfitting. We conduct a systematic comparison between the two approaches to B-PINN posterior estimation (HMC and VI), along with dropout used for quantifying uncertainty in deep neural networks. Our experiments show that HMC is more suitable than VI with a mean-field Gaussian approximation for B-PINN posterior estimation, while dropout employed in PINNs can hardly provide accurate predictions with reasonable uncertainty. Finally, we replace the BNN in the prior with a truncated Karhunen-Loève (KL) expansion combined with HMC or with a deep normalizing flow (DNF) model as posterior estimators. The KL expansion is as accurate as the BNN and much faster, but unlike the BNN-based framework it cannot be easily extended to high-dimensional problems. (c) 2020 Elsevier Inc. All rights reserved.
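
To make the Bayesian construction above concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of the unnormalized log-posterior that a B-PINN would sample with HMC or approximate with VI: a Gaussian prior over the network weights, a Gaussian likelihood for scattered noisy measurements of the solution, and a Gaussian likelihood for the PDE residual at collocation points. The toy 1D Poisson problem, the single-hidden-layer surrogate, the finite-difference second derivative (standing in for automatic differentiation), and all noise levels are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D Poisson problem u_xx = f on [0, 1], with exact solution u(x) = sin(pi x).
def f_true(x):
    return -np.pi**2 * np.sin(np.pi * x)

x_u = np.linspace(0.0, 1.0, 8)        # locations of noisy measurements of u (assumed)
x_f = np.linspace(0.0, 1.0, 32)       # collocation points for the PDE residual (assumed)
sigma_u, sigma_f = 0.05, 0.05         # assumed noise scales for data and residual
y_u = np.sin(np.pi * x_u) + sigma_u * rng.standard_normal(x_u.size)

# Small single-hidden-layer surrogate u_theta(x); theta packs all weights and biases.
H = 16
def unpack(theta):
    return theta[:H], theta[H:2*H], theta[2*H:3*H], theta[3*H]

def u_net(theta, x):
    w1, b1, w2, b2 = unpack(theta)
    return np.tanh(np.outer(x, w1) + b1) @ w2 + b2

def u_xx(theta, x, h=1e-3):
    # Central finite difference; the paper uses automatic differentiation instead.
    return (u_net(theta, x + h) - 2.0 * u_net(theta, x) + u_net(theta, x - h)) / h**2

def log_posterior(theta):
    log_prior = -0.5 * np.sum(theta**2)                    # standard normal prior on weights
    res_u = (u_net(theta, x_u) - y_u) / sigma_u            # misfit to noisy measurements
    res_f = (u_xx(theta, x_f) - f_true(x_f)) / sigma_f     # misfit of the PDE residual
    return log_prior - 0.5 * np.sum(res_u**2) - 0.5 * np.sum(res_f**2)

theta0 = 0.1 * rng.standard_normal(3 * H + 1)
print("unnormalized log-posterior at a random draw:", log_posterior(theta0))
```

In the paper, HMC or VI then draws posterior samples of the weights under a distribution of this form, and predictions with uncertainty estimates are obtained from statistics of the network outputs over those samples.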
Pages: 23
Related References
38 entries in total
[21] Mangan, N. M.; Kutz, J. N.; Brunton, S. L.; Proctor, J. L. Model selection for dynamical systems via sparse regression and information criteria. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, 2017, 473 (2204).
[22] Mao, Zhiping; Jagtap, Ameya D.; Karniadakis, George Em. Physics-informed neural networks for high-speed flows. Computer Methods in Applied Mechanics and Engineering, 2020, 360.
[23] Meng, Xuhui; Karniadakis, George Em. A composite neural network that learns from multi-fidelity data: Application to function approximation and inverse PDE problems. Journal of Computational Physics, 2020, 401.
[24] Pang, Guofei; Yang, Liu; Karniadakis, George Em. Neural-net-induced Gaussian process regression for function approximation and PDE solution. Journal of Computational Physics, 2019, 384: 270-288.
[25] Raissi, M.; Perdikaris, P.; Karniadakis, G. E. Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. Journal of Computational Physics, 2019, 378: 686-707.
[26] Raissi, Maziar; Perdikaris, Paris; Karniadakis, George Em. Machine learning of linear differential equations using Gaussian processes. Journal of Computational Physics, 2017, 348: 683-693.
[27] Raissi, Maziar; Perdikaris, Paris; Karniadakis, George Em. Inferring solutions of differential equations using noisy multi-fidelity data. Journal of Computational Physics, 2017, 335: 736-746.
[28] Rezende, Danilo Jimenez; Mohamed, Shakir. Variational inference with normalizing flows. arXiv:1505.05770, 2015.
[29] Rudy, Samuel H.; Brunton, Steven L.; Proctor, Joshua L.; Kutz, J. Nathan. Data-driven discovery of partial differential equations. Science Advances, 2017, 3 (4).
[30] Rusk, Nicole. Deep learning. Nature Methods, 2016, 13 (1): 35.