A novel meta-learning initialization method for physics-informed neural networks

Cited by: 45
Authors
Liu, Xu [1 ]
Zhang, Xiaoya [1 ]
Peng, Wei [1 ]
Zhou, Weien [1 ]
Yao, Wen [1 ]
Affiliations
[1] Chinese Acad Syst Mil Sci, Def Innovat Inst, Beijing 100071, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Physics-informed neural networks; Partial differential equations; Reptile initialization; Accelerated training; EQUATION; FRAMEWORK; MODEL;
DOI
10.1007/s00521-022-07294-2
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Physics-informed neural networks (PINNs) have been widely used to solve various scientific computing problems. However, large training costs limit PINNs for some real-time applications. Although some works have been proposed to improve the training efficiency of PINNs, few consider the influence of initialization. To this end, we propose a New Reptile initialization-based Physics-Informed Neural Network (NRPINN). The original Reptile algorithm is a meta-learning initialization method based on labeled data. PINNs can be trained with less labeled data, or even without any, by adding partial differential equations (PDEs) as a penalty term to the loss function. Inspired by this idea, we propose a new Reptile initialization that samples more tasks from the parameterized PDEs and adapts the penalty term of the loss. The new Reptile initialization can acquire initialization parameters from related tasks by supervised, unsupervised, and semi-supervised learning. PINNs starting from these initialization parameters can then solve PDEs efficiently. Moreover, the new Reptile initialization can also be applied to variants of PINNs. Finally, we demonstrate and verify the NRPINN on both forward problems, including the Poisson, Burgers, and Schrödinger equations, and inverse problems, in which unknown parameters in the PDEs are estimated. Experimental results show that NRPINN training is much faster and achieves higher accuracy than PINNs with other initialization methods.
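The meta-learning core the abstract describes is the Reptile outer loop: adapt the network to one task sampled from the parameterized PDE family, then move the shared initialization toward the adapted weights. Below is a minimal sketch of that loop, using a toy quadratic per-task loss as a stand-in for a PINN loss with its PDE-residual penalty; all function names and task definitions are illustrative, not from the paper.

```python
def inner_sgd(theta, target, lr=0.1, steps=10):
    # Inner loop: adapt the shared parameters to one task with plain
    # gradient descent on the per-task loss sum_i (theta_i - target_i)^2,
    # a toy stand-in for a PINN loss whose penalty term is that task's
    # PDE residual.
    for _ in range(steps):
        theta = [t - lr * 2.0 * (t - g) for t, g in zip(theta, target)]
    return theta


def reptile_init(task_targets, dim, meta_lr=0.1, meta_iters=200):
    # Reptile outer loop: after adapting to a task, nudge the shared
    # initialization toward the adapted weights:
    #     theta <- theta + meta_lr * (theta_task - theta)
    theta = [0.0] * dim
    for i in range(meta_iters):
        # Cycle through the tasks deterministically; the original Reptile
        # algorithm samples tasks at random.
        target = task_targets[i % len(task_targets)]
        adapted = inner_sgd(theta, target)
        theta = [t + meta_lr * (a - t) for t, a in zip(theta, adapted)]
    return theta


# Two toy "parameterized PDE tasks": the meta-learned initialization
# settles near their mean, so either task can be reached in a few
# inner-loop steps -- the claimed source of the training speedup.
targets = [[1.0, -1.0], [3.0, 1.0]]
init = reptile_init(targets, dim=2)
```

In the paper's setting the inner loop would train the PINN on one PDE instance (with a supervised, unsupervised, or semi-supervised loss), and the learned `init` would replace random initialization before solving a new PDE.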
Pages: 14511-14534
Page count: 24