Residual-based attention in physics-informed neural networks

Cited by: 45
Authors
Anagnostopoulos, Sokratis J. [1]
Toscano, Juan Diego [2]
Stergiopulos, Nikolaos [1]
Karniadakis, George Em [2,3]
Affiliations
[1] Ecole Polytech Fed Lausanne, Lab Hemodynam & Cardiovasc Technol, CH-1015 Lausanne, VD, Switzerland
[2] Brown Univ, Div Appl Math, Providence, RI 02912 USA
[3] Brown Univ, Sch Engn, Providence, RI 02912 USA
Funding
Swiss National Science Foundation
Keywords
Residual-based attention; PINNs accuracy; Adaptive weights; Fast convergence
DOI
10.1016/j.cma.2024.116805
Chinese Library Classification
T [Industrial Technology]
Discipline classification code
08
Abstract
Driven by the need for more efficient and seamless integration of physical models and data, physics-informed neural networks (PINNs) have seen a surge of interest in recent years. However, ensuring the reliability of their convergence and accuracy remains a challenge. In this work, we propose an efficient, gradient-less weighting scheme for PINNs that accelerates the convergence of dynamic or static systems. This simple yet effective attention mechanism is a bounded function of the evolving cumulative residuals and aims to make the optimizer aware of problematic regions at no extra computational cost or adversarial learning. We illustrate that this general method consistently achieves one order of magnitude faster convergence than vanilla PINNs and a minimum relative L2 error of O(10^-5) on typical benchmarks from the literature. The method is further tested on the inverse solution of the Navier-Stokes equations within the brain perivascular spaces, where it considerably improves the prediction accuracy. Furthermore, an ablation study is performed for each case to identify the contribution of the components that enhance the vanilla PINN formulation. Evident from the convergence trajectories is the ability of the optimizer to effectively escape from poor local minima or saddle points while focusing on the challenging domain regions, which consistently have a high residual score. We believe that alongside exact boundary conditions and other model reparameterizations, this type of attention mask could be an essential element for fast training of both PINNs and neural operators.
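The abstract describes the weighting scheme only qualitatively: a gradient-free, bounded function of the evolving cumulative point-wise residuals that up-weights problematic regions. A minimal sketch of one plausible update of this kind is shown below; the specific form (exponential decay by `gamma`, growth by the max-normalized residual magnitude scaled by `eta`) and all parameter names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def update_rba_weights(weights, residuals, gamma=0.999, eta=0.01):
    """One gradient-free residual-based attention update (illustrative form):
    an exponentially decayed, bounded accumulation of max-normalized
    point-wise residual magnitudes."""
    r = np.abs(residuals)
    # Normalizing by the max keeps each increment in [0, eta], so the
    # weights are bounded by eta / (1 - gamma) no matter how long we train.
    return gamma * weights + eta * r / (r.max() + 1e-12)

# Toy usage: collocation points whose residuals stay large accumulate
# larger weights, which would then multiply their loss terms in training.
rng = np.random.default_rng(0)
w = np.zeros(100)
for _ in range(1000):
    res = rng.normal(size=100)
    res[:10] *= 10.0  # a "problematic region" with persistently high residual
    w = update_rba_weights(w, res)
```

Because the update uses only residual magnitudes already computed for the loss, it adds no backward-pass cost, which is consistent with the abstract's "no extra computational cost" claim.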
Pages: 19
References
36 entries in total
[1] Basir S., 2023, arXiv.
[2] Basir S., 2022, arXiv:2209.09988.
[3] Basir S., Senocak I. Physics and equality constrained artificial neural networks: Application to forward and inverse problems with multi-fidelity data fusion. Journal of Computational Physics, 2022, 463.
[4] Boster K.A.S., Cai S., Ladron-de-Guevara A., Sun J., Zheng X., Du T., Thomas J.H., Nedergaard M., Karniadakis G.E., Kelley D.H. Artificial intelligence velocimetry reveals in vivo flow rates, pressure gradients, and shear stresses in murine perivascular flows. Proceedings of the National Academy of Sciences of the United States of America, 2023, 120(14).
[5] Cai S., Mao Z., Wang Z., Yin M., Karniadakis G.E. Physics-informed neural networks (PINNs) for fluid mechanics: a review. Acta Mechanica Sinica, 2021, 37(12): 1727-1738.
[6] Chen J.R., 2020, arXiv:2005.04554.
[7] Dong S., Ni N. A method for representing periodic functions and enforcing exactly periodic boundary conditions with deep neural networks. Journal of Computational Physics, 2021, 435.
[8] Eskin V.A., 2023, arXiv.
[9] Glorot X., 2010, Journal of Machine Learning Research, p. 249.
[10] Guan W.L., 2022, arXiv:2210.13212.