A novel meta-learning initialization method for physics-informed neural networks

Cited: 0
Authors
Xu Liu
Xiaoya Zhang
Wei Peng
Weien Zhou
Wen Yao
Affiliations
[1] Defense Innovation Institute, Chinese Academy of Military Science
Source
Neural Computing and Applications | 2022 / Vol. 34
Keywords
Physics-informed neural networks; Partial differential equations; Reptile initialization; Accelerated training;
DOI
Not available
Abstract
Physics-informed neural networks (PINNs) have been widely used to solve various scientific computing problems. However, large training costs limit PINNs in some real-time applications. Although several works have been proposed to improve the training efficiency of PINNs, few consider the influence of initialization. To this end, we propose a New Reptile initialization-based Physics-Informed Neural Network (NRPINN). The original Reptile algorithm is a meta-learning initialization method based on labeled data, whereas PINNs can be trained with little or even no labeled data by adding partial differential equations (PDEs) as a penalty term to the loss function. Inspired by this idea, our new Reptile initialization samples more tasks from the parameterized PDEs and adapts the penalty term of the loss accordingly. The new Reptile initialization can acquire initialization parameters from related tasks via supervised, unsupervised, or semi-supervised learning; PINNs starting from these parameters then solve PDEs efficiently. Moreover, the new Reptile initialization can also be applied to variants of PINNs. Finally, we demonstrate and verify the NRPINN on both forward problems, including the Poisson, Burgers, and Schrödinger equations, and inverse problems, where unknown parameters in the PDEs are estimated. Experimental results show that NRPINN training is much faster and achieves higher accuracy than PINNs with other initialization methods.
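The Reptile-style meta-initialization the abstract describes can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the Reptile outer update theta ← theta + epsilon * (theta_task − theta) is run over a toy family of parameterized quadratic losses standing in for sampled PDE tasks; the per-task loss, learning rates, and task distribution are all assumptions for illustration (in NRPINN the per-task loss would be a PINN loss whose penalty term comes from a sampled parameterized PDE).

```python
import numpy as np

def reptile_init(theta0, sample_task_grad, n_meta=200, k_inner=5,
                 inner_lr=0.1, meta_lr=0.5, rng=None):
    """Reptile meta-initialization (sketch).

    Repeatedly: sample a task, run k_inner SGD steps from the current
    meta-parameters, then move the meta-parameters a fraction of the
    way toward the task-adapted parameters:
        theta <- theta + meta_lr * (theta_task - theta)
    """
    rng = rng or np.random.default_rng(0)
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_meta):
        grad = sample_task_grad(rng)     # gradient of one sampled task's loss
        phi = theta.copy()
        for _ in range(k_inner):
            phi -= inner_lr * grad(phi)  # inner SGD on the task loss
        theta += meta_lr * (phi - theta) # Reptile outer update
    return theta

# Hypothetical task family standing in for parameterized PDEs: each
# sampled task has "residual penalty" L_a(theta) = ||theta - a||^2,
# with task parameter a drawn near a fixed mu.
mu = np.array([2.0, -1.0])

def sample_task_grad(rng):
    a = mu + 0.1 * rng.standard_normal(2)
    return lambda th: 2.0 * (th - a)

theta_init = reptile_init(np.zeros(2), sample_task_grad)
```

After the meta-loop, `theta_init` lies near the center of the task family, so a few gradient steps adapt it to any one task; this is the sense in which the learned initialization accelerates subsequent PINN training.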
Pages: 14511–14534
Number of pages: 23