Optimally weighted loss functions for solving PDEs with Neural Networks

Cited by: 68
Authors
van der Meer, Remco W. [1 ,2 ]
Oosterlee, Cornelis [3 ]
Borovykh, Anastasia [4 ]
Affiliations
[1] CWI, Sci Pk 123, NL-1098 XG Amsterdam, Netherlands
[2] Delft Univ Technol, Van Mourik Broekmanweg 6, NL-2628 XE Delft, Netherlands
[3] Univ Utrecht, Math Inst, Utrecht, Netherlands
[4] Univ Warwick, Warwick Business Sch, Coventry CV4 7AL, W Midlands, England
Keywords
Partial differential equation; Neural network; Convection-diffusion equation; Poisson equation; Loss functional; High-dimensional problems
DOI
10.1016/j.cam.2021.113887
Chinese Library Classification (CLC)
O29 [Applied Mathematics]
Subject classification number
070104
Abstract
Recent works have shown that deep neural networks can be employed to solve partial differential equations, giving rise to the framework of physics-informed neural networks (Raissi et al., 2019). We introduce a generalization of these methods that manifests as a scaling parameter which balances the relative importance of the different constraints imposed by partial differential equations. A mathematical motivation for these generalized methods is provided, which shows that for linear and well-posed partial differential equations, the functional form is convex. We then derive a choice for the scaling parameter that is optimal with respect to a measure of relative error. Because this optimal choice relies on having full knowledge of analytical solutions, we also propose a heuristic method to approximate this optimal choice. The proposed methods are compared numerically to the original methods on a variety of model partial differential equations, with the number of data points being updated adaptively. For several problems, including high-dimensional PDEs, the proposed methods are shown to significantly enhance accuracy. (c) 2021 Elsevier B.V. All rights reserved.
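To make the idea of a scaling parameter concrete, the sketch below shows a generic PINN-style loss for the 1-D Poisson problem u''(x) = f(x) on (0, 1) with homogeneous Dirichlet boundary conditions, in which a scalar weight `lam` balances the boundary constraint against the interior residual. This is a minimal illustration under our own assumptions, not the authors' code: the network `net`, the forcing `f`, the collocation sampling, and the value of `lam` are placeholders, and the paper's precise weighted functional and its optimal or heuristic choice of the weight may differ from this generic form.

```python
# Illustrative sketch (PyTorch), not the authors' implementation.
import torch

def weighted_pinn_loss(net, f, lam, n_interior=128):
    # Interior residual || u_xx - f ||^2 on random collocation points in (0, 1).
    x = torch.rand(n_interior, 1, requires_grad=True)
    u = net(x)
    u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
    interior = torch.mean((u_xx - f(x)) ** 2)

    # Boundary residual || u(0) ||^2 + || u(1) ||^2 for u(0) = u(1) = 0.
    xb = torch.tensor([[0.0], [1.0]])
    boundary = torch.mean(net(xb) ** 2)

    # The scaling parameter weights the boundary constraint relative to the
    # interior residual; here it is simply a user-chosen hyperparameter.
    return interior + lam * boundary

# Hypothetical usage: a small fully connected network and a sample forcing term.
net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
loss = weighted_pinn_loss(net, lambda x: torch.sin(x), lam=10.0)
loss.backward()
```

In this generic formulation the unweighted PINN loss corresponds to `lam = 1`; the paper's contribution is to motivate such a weighted functional and to characterize a choice of the weight that is optimal with respect to a measure of relative error, together with a heuristic approximation when the analytical solution is unknown.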
Pages: 18