A novel meta-learning initialization method for physics-informed neural networks

Cited by: 45
Authors
Liu, Xu [1 ]
Zhang, Xiaoya [1 ]
Peng, Wei [1 ]
Zhou, Weien [1 ]
Yao, Wen [1 ]
Affiliations
[1] Chinese Acad Syst Mil Sci, Def Innovat Inst, Beijing 100071, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Physics-informed neural networks; Partial differential equations; Reptile initialization; Accelerated training; EQUATION; FRAMEWORK; MODEL;
DOI
10.1007/s00521-022-07294-2
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405
Abstract
Physics-informed neural networks (PINNs) have been widely used to solve various scientific computing problems. However, large training costs limit PINNs in some real-time applications. Although several works have been proposed to improve the training efficiency of PINNs, few consider the influence of initialization. To this end, we propose a New Reptile initialization-based Physics-Informed Neural Network (NRPINN). The original Reptile algorithm is a meta-learning initialization method based on labeled data. PINNs can be trained with little or even no labeled data by adding partial differential equations (PDEs) as a penalty term in the loss function. Inspired by this idea, we propose a new Reptile initialization that samples tasks from parameterized PDEs and adapts the penalty term of the loss. The new Reptile initialization can acquire initialization parameters from related tasks by supervised, unsupervised, and semi-supervised learning. PINNs starting from these initialization parameters can then solve PDEs efficiently. Moreover, the new Reptile initialization can also be applied to variants of PINNs. Finally, we demonstrate and verify NRPINN on both forward problems, including the Poisson, Burgers, and Schrödinger equations, and inverse problems, where unknown parameters in the PDEs are estimated. Experimental results show that NRPINN trains much faster and achieves higher accuracy than PINNs with other initialization methods.
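As a rough, non-authoritative illustration of the approach described in the abstract, the sketch below runs a Reptile-style outer loop over tasks sampled from a parameterized PDE family, using the unsupervised PINN residual as the inner-loop loss. The toy Poisson family, all function and variable names, and the hyperparameters are assumptions made for illustration, not the authors' implementation.

```python
import copy
import math

import torch
import torch.nn as nn

# Minimal sketch of a Reptile-style meta-initialization loop for a PINN,
# assuming a toy family of 1D Poisson problems -u''(x) = a*sin(pi*x) on
# [0, 1] with u(0) = u(1) = 0, parameterized by `a`. Names, PDE family,
# and hyperparameters are illustrative assumptions, not the paper's code.


def pinn_loss(model, a, n_col=64):
    """Unsupervised PINN loss: PDE residual plus boundary-condition penalty."""
    x = torch.rand(n_col, 1, requires_grad=True)  # collocation points
    u = model(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    residual = -d2u - a * torch.sin(math.pi * x)
    xb = torch.tensor([[0.0], [1.0]])  # boundary points where u = 0
    return residual.pow(2).mean() + model(xb).pow(2).mean()


def reptile_init(model, task_params, meta_steps=500,
                 inner_steps=5, inner_lr=1e-3, meta_lr=0.1):
    """Reptile outer loop: adapt a copy of the model to a sampled task,
    then move the meta-parameters toward the adapted parameters."""
    for _ in range(meta_steps):
        a = task_params[torch.randint(len(task_params), (1,))]
        fast = copy.deepcopy(model)
        opt = torch.optim.SGD(fast.parameters(), lr=inner_lr)
        for _ in range(inner_steps):
            opt.zero_grad()
            pinn_loss(fast, a).backward()
            opt.step()
        with torch.no_grad():  # theta <- theta + meta_lr * (theta_fast - theta)
            for p, q in zip(model.parameters(), fast.parameters()):
                p += meta_lr * (q - p)
    return model


net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
tasks = torch.linspace(0.5, 2.0, 16)  # sampled parameters of the PDE family
reptile_init(net, tasks)  # `net` now carries the meta-learned initialization
```

After meta-training, the network would be fine-tuned as an ordinary PINN on a new PDE drawn from the same family, which is where the claimed speedup over random initialization would appear.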
Pages: 14511-14534
Number of pages: 24