discrete-time recurrent neural networks;
exponential bound;
global asymptotic stability (GAS);
global exponential stability (GES);
nonlinear difference equations;
DOI:
10.1109/TCSI.2006.874179
Chinese Library Classification (CLC):
TM [Electrical Technology];
TN [Electronics and Communication Technology];
Discipline classification code:
0808 ;
0809 ;
Abstract:
A set of necessary and sufficient conditions is presented for global exponential stability (GES) of a class of generic discrete-time recurrent neural networks. By means of these conditions, the GES and convergence properties of the neural networks are analyzed quantitatively. It is shown that exact equivalences hold among the GES property of the neural networks, the contractiveness of the deduced nonlinear operators, and the global asymptotic stability (GAS) of the neural networks together with the spectral radius of the Jacobian matrix of the neural networks at the unique equilibrium point being less than one. When the neural networks have small state feedback coefficients, it is shown further that the infimum of the exponential bounds of the trajectories of the neural networks equals exactly the spectral radius of the Jacobian matrix at the unique equilibrium point. The obtained results are helpful in understanding the essence of GES and in clarifying the difference between GES and GAS of discrete-time recurrent neural networks.
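The abstract does not spell out the network model, so the following is only a minimal numerical sketch under an assumed form. It takes a common discrete-time recurrent network x(k+1) = A x(k) + B tanh(W x(k)) with its unique equilibrium at the origin, computes the spectral radius of the Jacobian there, and checks that the observed asymptotic decay rate of a trajectory matches it, in the spirit of the abstract's claim that for small state feedback coefficients the infimum of exponential bounds equals this spectral radius. The matrices A, B, W are illustrative values, not from the paper.

```python
import numpy as np

# Assumed model (not given in the abstract):
#   x(k+1) = A x(k) + B tanh(W x(k)),  equilibrium x* = 0 since tanh(0) = 0.
A = np.diag([0.2, 0.1])            # small state-feedback coefficients (assumed)
B = np.array([[0.3, 0.1],
              [0.0, 0.2]])
W = np.array([[0.5, -0.2],
              [0.3,  0.4]])

# Jacobian at the equilibrium: tanh'(0) = 1, so J = A + B W.
J = A + B @ W
rho = max(abs(np.linalg.eigvals(J)))   # spectral radius; rho < 1 here

# Estimate the asymptotic exponential decay rate of one trajectory.
x = np.array([1.0, -1.0])
norms = []
for _ in range(200):
    x = A @ x + B @ np.tanh(W @ x)
    norms.append(np.linalg.norm(x))

# Geometric mean of the per-step contraction over the tail of the run,
# where the dynamics are essentially linearized around the equilibrium.
rate = (norms[-1] / norms[100]) ** (1.0 / (len(norms) - 101))

print(f"spectral radius of Jacobian: {rho:.4f}")
print(f"observed asymptotic decay rate: {rate:.4f}")
```

With these illustrative matrices the trajectory's tail decay rate agrees with the spectral radius to several decimal places, which is consistent with the equivalence the abstract describes; it is a numerical check, not a proof.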