Physical activation functions (PAFs): An approach for more efficient induction of physics into physics-informed neural networks (PINNs)

Cited: 5
Authors
Abbasi, Jassem [1 ]
Andersen, Pal Ostebo [1 ]
Affiliation
[1] Univ Stavanger, Dept Energy Resources, N-4036 Stavanger, Norway
Keywords
Deep Learning; Physics-informed Neural Networks; PDEs; Activation Functions;
DOI
10.1016/j.neucom.2024.128352
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
In recent years, the evolution of Physics-Informed Neural Networks (PINNs) has narrowed the gap between Deep Learning (DL) based methods and analytical/numerical approaches in scientific computing. However, challenges remain in training PINNs and in optimally embedding the underlying physical models. In this work, we introduce the concept of Physical Activation Functions (PAFs): generic activation functions (AFs) whose mathematical expressions are inherited from the physical description of the evaluated system, used instead of relying solely on standard AFs such as tanh and sigmoid for all neurons. The expression of a PAF can be selected based on individual terms appearing in the analytical solution, on the initial or boundary conditions of the PDE system, or on a component of a composition-of-functions type solution. PAFs can be applied in NNs in either explicit or self-adaptive form. In the explicit approach, the main activation function of the network is replaced by the PAF in some of the neurons. In the self-adaptive approach, the relative impact of the PAF (compared to the base AF) is determined automatically for each neuron. We tested the performance of PAFs in both forward and inverse problems for several PDEs, including the 1D and 2D wave equations, the advection-convection equation, the 1D heterogeneous and 2D diffusion equations, and the Laplace equation. The main advantage of PAFs over standard AFs was the more efficient constraining and interleaving of PINNs with the physical phenomena and their underlying mathematical models. The added PAFs significantly improved the predictions of PINNs for testing data outside the training distribution. Furthermore, applying PAFs reduced the size of the PINNs by up to 75% in different cases while maintaining the same accuracy, and improved the training process by reducing the total loss by one to two orders of magnitude.
PAFs also improved the precision of the estimated properties in the examined inverse problems, for both clean and noisy observational data. It can be concluded that using PAFs helps generate PINNs with lower complexity and greater validity over longer prediction ranges.
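The self-adaptive form described above can be sketched as a convex blend of a base AF and a physics-derived PAF, with a trainable per-neuron weight. The sketch below is a minimal illustration, not the authors' implementation: sin(x) is a hypothetical PAF choice (motivated by the sinusoidal terms in wave-equation solutions), and `alpha` stands in for the automatically determined relative-impact parameter.

```python
import numpy as np

def paf_neuron(x, alpha):
    """Self-adaptive blend of a base AF (tanh) and a physics-derived PAF.

    Hypothetical example: for a wave-equation-like problem whose analytical
    solution contains sinusoidal terms, sin(x) is a natural PAF candidate.
    alpha in [0, 1] plays the role of the per-neuron trainable weight that
    the self-adaptive approach would learn; alpha = 0 recovers the plain
    base AF, alpha = 1 the pure PAF (the explicit approach).
    """
    return alpha * np.sin(x) + (1.0 - alpha) * np.tanh(x)
```

In a real PINN, `alpha` would be a trainable parameter updated alongside the network weights, so each neuron settles on its own mix of base AF and PAF during training.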
Pages: 16