Predicting Output Responses of Nonlinear Dynamical Systems With Parametrized Inputs Using LSTM

Cited by: 4
Author
Feng, Lihong [1]
Affiliation
[1] Max Planck Inst Dynam Complex Tech Syst, D-39106 Magdeburg, Germany
Keywords
Logic gates; Standards; Machine learning; Training data; Training; Dimensionality reduction; Recurrent neural networks; Scientific machine learning; long short-term memory (LSTM); dynamical systems with inputs and outputs; output prediction; NEURAL-NETWORKS
DOI
10.1109/JMMCT.2023.3242044
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Subject Classification Code
0808; 0809
Abstract
Long short-term memory (LSTM) networks are increasingly used to predict the time evolution of dynamical systems, especially in fluid dynamics. Usually, LSTM is applied in a latent space obtained by reducing the dimension of the full dynamical system with proper orthogonal decomposition (POD), an autoencoder (AE), or a convolutional autoencoder (CAE). In this work, we propose to apply LSTM directly to the output data, without dimension reduction, for output response prediction. Since the dimension of the output is usually small, no dimension reduction is necessary, and hence no accuracy loss is incurred by it. Based on the standard LSTM structure, we propose an LSTM network with modified activation functions that is shown to be much more robust for predicting periodic waveforms. We are especially interested in demonstrating the efficiency of LSTM for predicting output responses corresponding to time-varying input signals, a setting rarely considered in the literature even though such systems are of great interest in electrical, mechanical, and control engineering. Numerical results for models from circuit simulation, neuroscience, and an electrochemical reaction demonstrate the efficiency of LSTM in predicting the dynamics of output responses.
Pages: 97-107
Number of pages: 11
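
The core idea in the abstract, applying LSTM directly to the low-dimensional output of a system driven by a time-varying input signal with no POD/AE/CAE reduction stage, can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's architecture: the PyTorch layer sizes, the synthetic input/output trajectories, and the training loop are all assumptions.

import torch
import torch.nn as nn

class OutputLSTM(nn.Module):
    # Maps an input-signal sequence u(t) directly to the output response y(t),
    # skipping any dimension-reduction stage.
    def __init__(self, input_dim=1, hidden_dim=64, output_dim=1, num_layers=2):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, num_layers, batch_first=True)
        self.head = nn.Linear(hidden_dim, output_dim)  # per-step readout

    def forward(self, u):        # u: (batch, time, input_dim)
        h, _ = self.lstm(u)      # h: (batch, time, hidden_dim)
        return self.head(h)      # y: (batch, time, output_dim)

# Illustrative training on synthetic trajectories (stand-ins for simulation data).
model = OutputLSTM()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

u_train = torch.randn(32, 200, 1)        # 32 realizations of a time-varying input
y_train = torch.cumsum(u_train, dim=1)   # toy "output response" of an integrator

for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(u_train), y_train)
    loss.backward()
    optimizer.step()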
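
The abstract also mentions an LSTM with modified activation functions for robustness on periodic waveforms, without specifying the modification. The cell below shows one generic way such a change could be realized: the usual tanh on the candidate and output paths is replaced by a swappable activation. The choice torch.sin is a hypothetical placeholder for illustration, not the author's modification.

import torch
import torch.nn as nn

class ModifiedLSTMCell(nn.Module):
    def __init__(self, input_dim, hidden_dim, act=torch.sin):
        super().__init__()
        self.act = act  # replaces tanh on the candidate and output paths
        self.gates = nn.Linear(input_dim + hidden_dim, 4 * hidden_dim)

    def forward(self, u_t, state):   # u_t: (batch, input_dim)
        h, c = state
        z = self.gates(torch.cat([u_t, h], dim=-1))
        i, f, g, o = z.chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        c = f * c + i * self.act(g)  # modified candidate activation
        h = o * self.act(c)          # modified output activation
        return h, c

# Unroll the cell over a length-200 input signal.
cell = ModifiedLSTMCell(1, 64)
h = c = torch.zeros(8, 64)
for t in range(200):
    h, c = cell(torch.randn(8, 1), (h, c))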