Finite basis physics-informed neural networks (FBPINNs): a scalable domain decomposition approach for solving differential equations

Cited: 0
Authors
Ben Moseley
Andrew Markham
Tarje Nissen-Meyer
Affiliations
[1] ETH Zürich, AI Center
[2] University of Oxford, Department of Computer Science
[3] University of Oxford, Department of Earth Sciences
Source
Advances in Computational Mathematics | 2023 / Volume 49
Keywords
Physics-informed neural networks; Domain decomposition; Multi-scale modelling; Forward modelling; Differential equations; Parallel computing; 65M99; 68T01;
DOI
Not available
Abstract
Recently, physics-informed neural networks (PINNs) have offered a powerful new paradigm for solving problems relating to differential equations. Compared to classical numerical methods, PINNs have several advantages, for example their ability to provide mesh-free solutions of differential equations and their ability to carry out forward and inverse modelling within the same optimisation problem. Whilst promising, a key limitation to date is that PINNs have struggled to accurately and efficiently solve problems with large domains and/or multi-scale solutions, which is crucial for their real-world application. Multiple significant and related factors contribute to this issue, including the increasing complexity of the underlying PINN optimisation problem as the problem size grows and the spectral bias of neural networks. In this work, we propose a new, scalable approach for solving large problems relating to differential equations called finite basis physics-informed neural networks (FBPINNs). FBPINNs are inspired by classical finite element methods, where the solution of the differential equation is expressed as the sum of a finite set of basis functions with compact support. In FBPINNs, neural networks are used to learn these basis functions, which are defined over small, overlapping subdomains. FBPINNs are designed to address the spectral bias of neural networks by using separate input normalisation over each subdomain, and to reduce the complexity of the underlying optimisation problem by using many smaller neural networks in a parallel divide-and-conquer approach. Our numerical experiments show that FBPINNs are effective in solving both small and larger, multi-scale problems, outperforming standard PINNs both in accuracy and in the computational resources required, potentially paving the way to the application of PINNs on large, real-world problems.
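The abstract describes the core FBPINN construction: the solution is represented as a sum of outputs from small subnetworks, each weighted by a smooth window with compact support on its own overlapping subdomain and fed coordinates normalised to that subdomain. The following is a minimal, illustrative sketch of that idea in one dimension, written in PyTorch. The subnetwork architecture, cosine windows, subdomain layout and soft boundary-condition penalty are simplifying assumptions made here for exposition; they are not the paper's exact configuration.

```python
import math
import torch
import torch.nn as nn


class Subnet(nn.Module):
    """Small fully connected network assigned to one subdomain."""

    def __init__(self, hidden: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)


class FBPINN1D(nn.Module):
    """Sum of window-weighted subnetworks over overlapping 1D subdomains."""

    def __init__(self, centres, half_widths):
        super().__init__()
        self.register_buffer("centres", torch.as_tensor(centres, dtype=torch.float32))
        self.register_buffer("half_widths", torch.as_tensor(half_widths, dtype=torch.float32))
        self.subnets = nn.ModuleList(Subnet() for _ in centres)

    def forward(self, x):
        # x: (N, 1) global coordinates; returns the (N, 1) solution estimate.
        u = torch.zeros_like(x)
        for net, c, w in zip(self.subnets, self.centres, self.half_widths):
            xi = (x - c) / w  # separate input normalisation for this subdomain
            # Smooth window with compact support on [c - w, c + w] (illustrative choice).
            window = torch.where(
                xi.abs() <= 1.0,
                torch.cos(0.5 * math.pi * xi) ** 2,
                torch.zeros_like(xi),
            )
            u = u + window * net(xi)  # basis-function-like contribution
        return u


# Toy usage: learn du/dx = cos(15x) on [0, 1] with u(0) = 0 enforced by a soft penalty.
model = FBPINN1D(centres=[0.0, 0.25, 0.5, 0.75, 1.0], half_widths=[0.2] * 5)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(2000):
    x = torch.rand(200, 1, requires_grad=True)
    u = model(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    residual = ((du - torch.cos(15 * x)) ** 2).mean()
    boundary = model(torch.zeros(1, 1)).pow(2).mean()
    loss = residual + boundary
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because each subnetwork only sees normalised coordinates of its own small subdomain, all subproblems are posed at a similar, low effective frequency, which is the mechanism the abstract credits for mitigating spectral bias; in the paper the many subnetworks are additionally trained in a parallel divide-and-conquer fashion rather than the simple joint loop shown above.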