Neuroevolution of Physics-Informed Neural Nets: Benchmark Problems and Comparative Results

Cited by: 3
Authors
Sung, Nicholas Wei Yong [1 ]
Wong, Jian Cheng [2 ,3 ]
Ooi, Chin Chun [1 ,2 ]
Gupta, Abhishek [4 ]
Chiu, Pao-Hsiung [2 ]
Ong, Yew-Soon [3 ,5 ]
Affiliations
[1] Agcy Sci Technol & Res, Ctr Frontier AI Res, Singapore, Singapore
[2] Agcy Sci Technol & Res, Inst High Performance Comp, Singapore, Singapore
[3] Nanyang Technol Univ, Singapore, Singapore
[4] Agcy Sci Technol & Res, Singapore Inst Mfg Technol, Singapore, Singapore
[5] Agcy Sci Technol & Res, Singapore, Singapore
Source
PROCEEDINGS OF THE 2023 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE COMPANION, GECCO 2023 COMPANION | 2023
Keywords
Neuroevolution; stochastic gradient descent; physics-informed neural networks; benchmarks; NETWORKS;
DOI
10.1145/3583133.3596397
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405
Abstract
The potential of learned models for fundamental scientific research and discovery is drawing increasing attention worldwide. Physics-informed neural networks (PINNs), whose loss functions directly embed the governing equations of scientific phenomena, are among the key techniques at the forefront of recent advances. PINNs are typically trained using stochastic gradient descent methods, akin to their deep learning counterparts. However, analysis in this paper shows that PINNs' unique loss formulations lead to a high degree of complexity and ruggedness that may not be conducive to gradient descent. Unlike in standard deep learning, PINN training requires globally optimal parameter values that satisfy physical laws as closely as possible. Spurious local optima, indicative of erroneous physics, must be avoided. Hence, neuroevolution algorithms, with their superior global search capacity, may be a better choice for PINNs than gradient descent methods. Here, we propose a set of five benchmark problems, with open-source codes, spanning diverse physical phenomena for novel neuroevolution algorithm development. Using these, we compare two neuroevolution algorithms against the commonly used stochastic gradient descent, and our baseline results support the claim that neuroevolution can surpass gradient descent, ensuring better physics compliance in the predicted outputs.
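To make the abstract's setup concrete, the following is a minimal illustrative sketch, not the authors' code or benchmarks: a physics-informed loss (PDE/ODE residual plus a boundary penalty) minimized by a simple gradient-free (1+λ) evolution strategy, standing in for the neuroevolution algorithms compared in the paper. The toy problem u'(x) = u(x), u(0) = 1 on [0, 1] (exact solution e^x), the 1-H-1 tanh network, the finite-difference derivative, and all hyperparameters are assumptions chosen for brevity.

```python
# Hypothetical PINN-style loss optimized by a (1+8) evolution strategy.
# Toy problem: u'(x) = u(x), u(0) = 1 on [0, 1]; exact solution is e^x.
import math
import random

H = 8  # hidden units of a 1-H-1 tanh network (illustrative choice)

def predict(params, x):
    w1, b1, w2, b2 = params
    return sum(w2[j] * math.tanh(w1[j] * x + b1[j]) for j in range(H)) + b2

def pinn_loss(params, xs, eps=1e-4):
    # Physics residual r(x) = u'(x) - u(x), with u' approximated by
    # central finite differences, plus a penalty enforcing u(0) = 1.
    res = 0.0
    for x in xs:
        du = (predict(params, x + eps) - predict(params, x - eps)) / (2 * eps)
        res += (du - predict(params, x)) ** 2
    bc = (predict(params, 0.0) - 1.0) ** 2
    return res / len(xs) + bc

def mutate(params, sigma, rng):
    # Gaussian perturbation of every weight (a deliberately simple operator).
    w1, b1, w2, b2 = params
    return ([w + rng.gauss(0, sigma) for w in w1],
            [b + rng.gauss(0, sigma) for b in b1],
            [w + rng.gauss(0, sigma) for w in w2],
            b2 + rng.gauss(0, sigma))

rng = random.Random(0)
xs = [i / 20 for i in range(21)]  # collocation points in [0, 1]
best = ([rng.gauss(0, 1) for _ in range(H)], [0.0] * H,
        [rng.gauss(0, 0.5) for _ in range(H)], 1.0)
best_f = pinn_loss(best, xs)
init_f = best_f
for gen in range(2000):          # elitist (1+8)-ES: keep the fittest mutant
    for _ in range(8):
        cand = mutate(best, 0.05, rng)
        f = pinn_loss(cand, xs)
        if f < best_f:
            best, best_f = cand, f

print(f"loss {init_f:.4f} -> {best_f:.4f}, "
      f"u(1) ~ {predict(best, 1.0):.3f} (exact {math.e:.3f})")
```

Because the loss needs only function evaluations, not gradients, the same loop could swap in CMA-ES or any other neuroevolution variant; that black-box property is exactly what makes evolutionary search applicable to the rugged PINN loss landscapes the paper analyzes.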
Pages: 2144-2151
Page count: 8