PHYSICS-INFORMED GENERATIVE ADVERSARIAL NETWORKS FOR STOCHASTIC DIFFERENTIAL EQUATIONS

Cited by: 273
Authors
Yang, Liu [1]
Zhang, Dongkun [1]
Karniadakis, George Em [1]
Affiliations
[1] Brown Univ, Div Appl Math, Providence, RI 02912 USA
Keywords
WGAN-GP; multiplayer GANs; high-dimensional problems; inverse problems; elliptic stochastic problems;
DOI
10.1137/18M1225409
Chinese Library Classification
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
We developed a new class of physics-informed generative adversarial networks (PI-GANs) to solve forward, inverse, and mixed stochastic problems in a unified manner based on a limited number of scattered measurements. Unlike standard GANs, which rely solely on data for training, here we encoded into the architecture of GANs the governing physical laws in the form of stochastic differential equations (SDEs) using automatic differentiation. In particular, we applied Wasserstein GANs with gradient penalty (WGAN-GP) because of their enhanced stability compared to vanilla GANs. We first tested WGAN-GP in approximating Gaussian processes of different correlation lengths based on data realizations collected from simultaneous reads at sparsely placed sensors. We obtained good agreement between the generated stochastic processes and the target ones even when there is a mismatch between the input noise dimensionality and the effective dimensionality of the target stochastic processes. We also studied the overfitting issue for both the discriminator and the generator, and we found that overfitting occurs in the generator as well as in the discriminator, for which it had previously been reported. Subsequently, we considered the solution of elliptic SDEs, which requires approximations of three stochastic processes, namely the solution, the forcing, and the diffusion coefficient. Here again, we assumed data collected from simultaneous reads at a limited number of sensors for the multiple stochastic processes. Three generators were used for the PI-GANs: two of them were feed-forward deep neural networks (DNNs), while the third was the neural network induced by the SDE. For the case where we have one group of data, we employed one feed-forward DNN as the discriminator, while for the case of multiple groups of data we employed multiple discriminators in the PI-GANs. We solved forward, inverse, and mixed problems without changing the framework of PI-GANs, obtaining both the means and the standard deviations of the stochastic solution and of the diffusion coefficient in good agreement with benchmarks. In this work, we have demonstrated the effectiveness of PI-GANs in solving SDEs of about 120 dimensions. In principle, PI-GANs could tackle very high dimensional problems given more sensor data, with low-polynomial growth in computational cost.
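To make the construction described in the abstract concrete, the following is a minimal PyTorch sketch of the PI-GAN idea, not the authors' implementation, for a one-dimensional elliptic SDE of the assumed form -d/dx( k(x; omega) du/dx(x; omega) ) = f(x; omega): two feed-forward generators produce snapshots of the solution u and the diffusion coefficient k at the sensor locations, the forcing snapshot is induced from them by automatic differentiation (the "neural network induced by the SDE"), and a single WGAN-GP discriminator compares generated snapshots with measured ones. All names, network sizes, sensor counts, hyperparameters, and the placeholder data are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SnapshotGenerator(nn.Module):
    """Feed-forward DNN mapping (sensor location x, latent noise z) to a field value."""
    def __init__(self, noise_dim=4, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1 + noise_dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )

    def forward(self, x, z):
        # x: (batch, n_sensors, 1); z: (batch, noise_dim)
        zx = z.unsqueeze(1).expand(-1, x.shape[1], -1)
        return self.net(torch.cat([x, zx], dim=-1)).squeeze(-1)   # (batch, n_sensors)

def induced_forcing(gen_u, gen_k, x, z):
    """'Generator' induced by the SDE: f = -d/dx( k du/dx ), via automatic differentiation."""
    x = x.clone().requires_grad_(True)
    u, k = gen_u(x, z), gen_k(x, z)
    u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0].squeeze(-1)
    flux_x = torch.autograd.grad((k * u_x).sum(), x, create_graph=True)[0].squeeze(-1)
    return -flux_x

def gradient_penalty(disc, real, fake):
    """WGAN-GP term: ( ||grad D(x_hat)|| - 1 )^2 on random interpolates of real and fake snapshots."""
    eps = torch.rand(real.shape[0], 1)
    x_hat = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    grads = torch.autograd.grad(disc(x_hat).sum(), x_hat, create_graph=True)[0]
    return ((grads.norm(2, dim=1) - 1.0) ** 2).mean()

# Illustrative setup: 20 sensors on [0, 1], 4-dimensional input noise, batches of 64 snapshots.
n_sensors, noise_dim, batch = 20, 4, 64
x_sensors = torch.linspace(0.0, 1.0, n_sensors).view(1, n_sensors, 1).expand(batch, -1, -1)

gen_u, gen_k = SnapshotGenerator(noise_dim), SnapshotGenerator(noise_dim)
disc = nn.Sequential(nn.Linear(3 * n_sensors, 128), nn.LeakyReLU(0.2),
                     nn.Linear(128, 128), nn.LeakyReLU(0.2), nn.Linear(128, 1))
opt_g = torch.optim.Adam(list(gen_u.parameters()) + list(gen_k.parameters()),
                         lr=1e-4, betas=(0.5, 0.9))
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-4, betas=(0.5, 0.9))

def fake_snapshot():
    # Stack the generated u, k, and the SDE-induced f at the sensor locations.
    z = torch.randn(batch, noise_dim)
    u, k = gen_u(x_sensors, z), gen_k(x_sensors, z)
    f = induced_forcing(gen_u, gen_k, x_sensors, z)
    return torch.cat([u, k, f], dim=1)

real_snapshot = torch.randn(batch, 3 * n_sensors)   # placeholder for measured sensor data

# One critic (discriminator) update: Wasserstein loss plus gradient penalty.
fake = fake_snapshot().detach()
d_loss = (disc(fake).mean() - disc(real_snapshot).mean()
          + 10.0 * gradient_penalty(disc, real_snapshot, fake))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# One generator update: make the physics-induced snapshots indistinguishable from data.
g_loss = -disc(fake_snapshot()).mean()
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

In a full run the critic step would typically be repeated several times per generator step, as is standard for WGAN-GP, and the placeholder real_snapshot would be replaced by actual simultaneous sensor readings of u, k, and f; for multiple groups of data, several such discriminators could be used, as described in the abstract.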
Pages: A292 / A317
Number of pages: 26