ADLGM: An efficient adaptive sampling deep learning Galerkin method

Cited by: 11
Authors
Aristotelous, Andreas C. [1];
Mitchell, Edward C. [2];
Maroulas, Vasileios [2]
Affiliations
[1] Univ Akron, Dept Math, Akron, OH 44325 USA
[2] Univ Tennessee, Dept Math, Knoxville, TN 37996 USA
Keywords
Machine learning; Numerical solution; Partial differential equations; Informed neural-networks; Cahn-Hilliard equation; Discontinuous Galerkin; Framework; Algorithm; Second-order; Time
DOI
10.1016/j.jcp.2023.111944
CLC classification number
TP39 [Computer Applications]
Discipline classification codes
081203; 0835
Abstract
In this paper, we devise an adaptive sampling technique for the deep Galerkin method (DGM), aimed at improving and speeding up the training of the deep neural network when learning the solution of partial differential equations (PDEs). The proposed adaptive algorithm is inspired by the mesh-adaptivity techniques used in the classical numerical PDE field. Its implementation within the DGM paradigm is natural and efficient, and it is shown to improve the DGM algorithm. We demonstrate that our adaptive sampling DGM scheme is convergent and more accurate than DGM, provided the residual mirrors the local error, at the same number of training steps and using the same or fewer training points. We present a multitude of tests applied to selected PDEs, discussing the robustness of our scheme. (c) 2023 Elsevier Inc. All rights reserved.
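The record does not include the paper's algorithm itself, so the following is only a minimal illustrative sketch of the idea the abstract describes: residual-driven adaptive resampling of collocation points in a DGM/PINN-style solver. It is written for a toy 1D Poisson problem; the network size, the hypothetical `resample_fraction` and `resample_every` parameters, and the boundary-penalty weight are assumptions for illustration, not values or code from the paper.

```python
# Minimal sketch (not the authors' code): residual-driven adaptive sampling in a
# DGM/PINN-style collocation solver for -u''(x) = f(x) on (0, 1), u(0) = u(1) = 0.
import math
import torch

torch.manual_seed(0)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def f(x):
    # Forcing chosen so the exact solution is u(x) = sin(pi x).
    return (math.pi ** 2) * torch.sin(math.pi * x)

def pde_residual(x):
    """Pointwise strong-form residual r(x) = -u''(x) - f(x)."""
    x = x.clone().requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    return -d2u - f(x)

n_pts, resample_fraction, resample_every = 256, 0.25, 100   # assumed hyperparameters
x_int = torch.rand(n_pts, 1)                                # interior collocation points
x_bnd = torch.tensor([[0.0], [1.0]])                        # Dirichlet boundary points

for step in range(2000):
    r = pde_residual(x_int)
    loss = (r ** 2).mean() + 10.0 * (net(x_bnd) ** 2).mean()  # residual + boundary penalty
    opt.zero_grad()
    loss.backward()
    opt.step()

    # Adaptive step: periodically keep the collocation points with the largest
    # |residual| (used as a local error indicator, in the spirit of mesh
    # refinement) and redraw the remaining fraction uniformly at random.
    if (step + 1) % resample_every == 0:
        err = pde_residual(x_int).abs().squeeze(1).detach()
        n_keep = int((1.0 - resample_fraction) * n_pts)
        keep = err.topk(n_keep).indices
        x_new = torch.rand(n_pts - n_keep, 1)
        x_int = torch.cat([x_int[keep], x_new], dim=0)
```

The top-k selection above stands in for whatever marking rule the paper actually uses; the point it illustrates is the one stated in the abstract, namely that the sampling density follows the PDE residual, which is assumed to mirror the local error.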
Pages: 17