Reservoir Computing for Solving Ordinary Differential Equations

Cited: 0
Authors
Mattheakis, Marios [1 ]
Joy, Hayden [1 ]
Protopapas, Pavlos [1 ]
Affiliations
[1] Harvard Univ, John A Paulson Sch Engn & Appl Sci, Cambridge, MA 02138 USA
Keywords
Unsupervised learning; one-shot training; transfer learning; equation-driven model; neural networks
DOI
10.1142/S0218213023500306
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
There is a wave of interest in using physics-informed neural networks to solve differential equations. Most existing methods are based on feed-forward networks, while recurrent neural network solvers have not been extensively explored. We introduce a reservoir computing (RC) architecture, an echo-state recurrent neural network capable of discovering approximate solutions that satisfy ordinary differential equations (ODEs). We suggest an approach for calculating time derivatives of recurrent neural network outputs without using back-propagation. The internal weights of an RC are fixed, while only a linear output layer is trained, yielding efficient training. However, RC performance strongly depends on finding the optimal hyper-parameters, which is a computationally expensive process. We use Bayesian optimization to efficiently discover optimal sets in a high-dimensional hyper-parameter space and numerically show that one such set is robust and can be transferred to solve an ODE for different initial conditions and time ranges. A closed-form formula for the optimal output weights is derived to solve first-order linear equations in a one-shot, backpropagation-free learning process. We extend the RC approach to nonlinear systems of ODEs using a hybrid optimization method that combines gradient descent with Bayesian optimization. Evaluation on linear and nonlinear systems of equations demonstrates the efficiency of the RC ODE solver.
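The one-shot idea in the abstract, that a reservoir with fixed random internal weights plus a single linear least-squares solve for the output layer can satisfy a linear ODE, can be sketched minimally as follows. This is not the paper's exact architecture or derivative computation: here the state derivatives come from finite differences (standing in for the backprop-free derivative method), the test equation dy/dt = -y with y(0) = 1 is chosen for illustration, and all parameter values (reservoir size, spectral radius 0.9, initial-condition weight) are illustrative assumptions rather than the paper's tuned hyper-parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem: dy/dt = -y, y(0) = 1 on t in [0, 5]; exact solution exp(-t).
t = np.linspace(0.0, 5.0, 501)
dt = t[1] - t[0]

# Fixed random reservoir (echo-state style): internal weights are never trained.
N = 200
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # rescale to spectral radius 0.9
w_in = rng.uniform(-1.0, 1.0, size=N)
b = rng.uniform(-0.2, 0.2, size=N)

# Wash out the initial transient so states vary smoothly with the time input.
h = np.zeros(N)
for _ in range(100):
    h = np.tanh(W @ h + w_in * t[0] + b)

# Drive the reservoir with time as its input and collect the states.
H = np.zeros((len(t), N))
for n, tn in enumerate(t):
    h = np.tanh(W @ h + w_in * tn + b)
    H[n] = h

# Time derivatives of the states via central finite differences
# (a stand-in for the paper's backprop-free derivative computation).
H_dot = np.gradient(H, dt, axis=0)

# One-shot training: y = H w, so the ODE residual (H_dot + H) w should vanish
# and the initial condition h(0)^T w = 1 should hold. Stack both into one
# linear least-squares problem for the output weights.
lam = 100.0  # weight emphasizing the initial-condition row
A = np.vstack([H_dot + H, lam * H[0]])
rhs = np.zeros(len(t) + 1)
rhs[-1] = lam * 1.0
w_out, *_ = np.linalg.lstsq(A, rhs, rcond=None)

y_pred = H @ w_out
err = np.max(np.abs(y_pred - np.exp(-t)))
print(f"max abs error vs exp(-t): {err:.2e}")
```

Note that no target solution values appear anywhere in the fit; the residual of the equation itself plus the initial condition fully determine the readout, which is what makes the training equation-driven rather than data-driven.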
Pages: 17