Training physics-informed neural networks: One learning to rule them all?

Cited by: 20
Authors
Monaco, Simone [1 ]
Apiletti, Daniele [1 ]
Affiliations
[1] Politecn Torino, Dept Control & Comp Engn, I-10129 Turin, Italy
Keywords
Deep learning; Partial differential equations; Computational physics; Framework
DOI
10.1016/j.rineng.2023.101023
Chinese Library Classification (CLC)
T [Industrial technology];
Discipline code
08;
Abstract
Physics-informed neural networks (PINNs) are gaining popularity as powerful tools for solving nonlinear Partial Differential Equations (PDEs) through Deep Learning. PINNs are trained by incorporating physical laws as soft constraints in the loss function. This approach is effective for simple equations but fails on several classes of more complex dynamical systems. In this work, we put three state-of-the-art PINN training methods on the test bench, solving three popular PDEs of increasing complexity; we additionally apply the Fourier Feature Embedding (FFE) and introduce a novel implementation of Curriculum regularization. Our experiments evaluate the convergence of the trained PINN and its prediction error rate for different training sizes and training lengths (i.e., numbers of epochs). To summarize the behaviour of each learning method, we introduce a new metric, named overall score. Our experiments show that the same approach can be the best performer in one setting yet fail to converge at all in another: the same PDE can be solved best by different learning methods, depending on the training size or the use of FFE. We conclude that there is no single learning method to train them all, yet we extract useful patterns that can drive future work in this growing area of research. All code and data of this manuscript are publicly available on GitHub.
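The soft-constraint training that the abstract describes can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: a two-parameter trial function stands in for the neural network, a toy ODE (u'(x) = u(x) with u(0) = 1, exact solution exp(x)) stands in for the PDE, and a grid search stands in for gradient-based optimisation. The names (`pinn_loss`, `xs`, `a`, `b`) are hypothetical.

```python
import numpy as np

# Toy PINN-style training: solve u'(x) = u(x) on [0, 1] with u(0) = 1.
# The trial function u(x; a, b) = b * exp(a * x) stands in for the neural
# network; a real PINN would use an MLP and automatic differentiation.
xs = np.linspace(0.0, 1.0, 50)                  # interior collocation points

def pinn_loss(a, b):
    u = b * np.exp(a * xs)                      # trial solution at collocation points
    du = a * b * np.exp(a * xs)                 # its derivative (analytic here)
    physics = np.mean((du - u) ** 2)            # soft PDE-residual constraint
    boundary = (b * np.exp(a * 0.0) - 1.0) ** 2 # soft constraint enforcing u(0) = 1
    return physics + boundary                   # composite PINN-style loss

# Minimise the composite loss with a coarse grid search over (a, b);
# a real PINN would instead run Adam/L-BFGS on the network weights.
grid = np.linspace(0.0, 2.0, 101)
a_best, b_best = min(((a, b) for a in grid for b in grid),
                     key=lambda p: pinn_loss(*p))
print(a_best, b_best)  # → a_best ≈ 1.0, b_best ≈ 1.0, recovering u = exp(x)
```

The key point the sketch shows is that the physics enters only through the loss: the PDE residual and the boundary condition are penalty terms, never hard constraints, which is exactly why training can fail to converge on harder dynamical systems.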
Pages: 9