Physics informed token transformer for solving partial differential equations

Cited: 4
Authors
Lorsung, Cooper [1]
Li, Zijie [1]
Barati Farimani, Amir [1,2,3]
Affiliations
[1] Carnegie Mellon Univ, Dept Mech Engn, Pittsburgh, PA 15289 USA
[2] Carnegie Mellon Univ, Dept Biomed Engn, Pittsburgh, PA 15289 USA
[3] Carnegie Mellon Univ, Machine Learning Dept, Pittsburgh, PA 15289 USA
Source
MACHINE LEARNING: SCIENCE AND TECHNOLOGY | 2024, Vol. 5, Issue 1
Funding
U.S. National Science Foundation
Keywords
machine learning; neural operators; physics informed;
DOI
10.1088/2632-2153/ad27e3
CLC Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Solving partial differential equations (PDEs) is the core of many fields of science and engineering. While classical approaches are often prohibitively slow, machine learning models often fail to incorporate complete system information. Over the past few years, transformers have had a significant impact on the field of Artificial Intelligence and have seen increased usage in PDE applications. However, despite their success, transformers currently lack integration with physics and reasoning. This study aims to address this issue by introducing Physics Informed Token Transformer (PITT). The purpose of PITT is to incorporate the knowledge of physics by embedding PDEs into the learning process. PITT uses an equation tokenization method to learn an analytically-driven numerical update operator. By tokenizing PDEs and embedding partial derivatives, the transformer models become aware of the underlying knowledge behind physical processes. To demonstrate this, PITT is tested on challenging 1D and 2D PDE operator learning tasks. The results show that PITT outperforms popular neural operator models and has the ability to extract physically relevant information from governing equations.
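The abstract's central idea, representing a governing equation as a sequence of tokens that a transformer can attend over, can be illustrated with a minimal sketch. The vocabulary, the regex-based splitter, and the example Burgers-style equation string below are all illustrative assumptions, not the paper's actual tokenizer:

```python
import re

# Hypothetical token vocabulary for simple 1D PDE strings; the real PITT
# vocabulary and tokenization scheme are defined in the paper and its code.
VOCAB = ["nu", "d", "u", "t", "x", "(", ")", "/", "+", "-", "*", "=", "0"]
TOKEN_TO_ID = {tok: i for i, tok in enumerate(VOCAB)}

def tokenize(equation: str):
    """Split a PDE string into known tokens and map each to an integer ID.

    Longer tokens are matched first so that e.g. 'nu' is not split into
    'n' (unknown) and 'u'.
    """
    pattern = "|".join(re.escape(t) for t in sorted(VOCAB, key=len, reverse=True))
    tokens = re.findall(pattern, equation)
    return tokens, [TOKEN_TO_ID[t] for t in tokens]

# Inviscid Burgers' equation written in a flat string form (an assumption
# about the input format): du/dt + u * du/dx = 0
tokens, ids = tokenize("d(u)/d(t)+u*d(u)/d(x)=0")
```

The resulting ID sequence is what a transformer embedding layer would consume alongside the numerical solution data, which is how tokenized equations can condition the learned update operator on the governing physics.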
Pages: 12