Sufficient and necessary conditions for global exponential stability of discrete-time recurrent neural networks

Cited by: 44
Authors
Wang, Lisheng [1 ]
Xu, Zongben
Affiliations
[1] Shanghai Jiao Tong Univ, Dept Automat, Shanghai 200030, Peoples R China
[2] Xi An Jiao Tong Univ, Res Ctr Sci & Res Appl Math, Xian 710049, Peoples R China
[3] Xi An Jiao Tong Univ, Sch Sci, Inst Informat & Syst Sci, Xian 710049, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
discrete-time recurrent neural networks; exponential bound; global asymptotic stability (GAS); global exponential stability (GES); nonlinear difference equations;
DOI
10.1109/TCSI.2006.874179
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic and Communication Technology];
Discipline Classification Code
0808 ; 0809 ;
Abstract
A set of sufficient and necessary conditions is presented for the global exponential stability (GES) of a class of generic discrete-time recurrent neural networks. Using these conditions, the GES and convergence properties of the networks are analyzed quantitatively. It is shown that exact equivalences hold among three properties: the GES of the networks, the contractiveness of the deduced nonlinear operators, and the global asymptotic stability (GAS) of the networks combined with a spectral radius less than one for the Jacobian matrix of the networks at the unique equilibrium point. When the networks have small state feedback coefficients, it is further shown that the infimum of the exponential bounds on the trajectories of the networks equals exactly the spectral radius of the Jacobian matrix at the unique equilibrium point. These results help clarify the essence of GES and the difference between GES and GAS for discrete-time recurrent neural networks.
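The stability criterion summarized in the abstract (GAS together with a Jacobian spectral radius below one at the unique equilibrium) can be illustrated numerically. The sketch below is a minimal assumption: the specific network form x(t+1) = A·tanh(x(t)) + b, the weight matrix A, and the bias b are illustrative choices, not the exact model class studied in the paper.

```python
import numpy as np

# Hypothetical discrete-time recurrent network: x(t+1) = A @ tanh(x(t)) + b.
# Small state-feedback coefficients (||A|| < 1) make the map a contraction.
A = np.array([[0.3, 0.1],
              [0.0, 0.4]])
b = np.array([0.1, -0.2])

def step(x):
    return A @ np.tanh(x) + b

# Locate the unique equilibrium by fixed-point iteration
# (convergence is guaranteed here because the map is contractive).
x_eq = np.zeros(2)
for _ in range(200):
    x_eq = step(x_eq)

# Jacobian of the map at the equilibrium: A @ diag(tanh'(x_eq)).
J = A @ np.diag(1.0 - np.tanh(x_eq) ** 2)
rho = max(abs(np.linalg.eigvals(J)))  # spectral radius

# GES criterion from the abstract: spectral radius < 1 at the equilibrium.
print("spectral radius < 1:", rho < 1.0)

# A perturbed trajectory contracts back toward the equilibrium,
# with an exponential rate governed (near x_eq) by rho.
x = x_eq + np.array([1.0, -1.0])
errs = []
for t in range(30):
    errs.append(np.linalg.norm(x - x_eq))
    x = step(x)
print("error shrinks:", errs[-1] < errs[0])
```

The choice of a triangular A keeps the Jacobian eigenvalues easy to read off; any matrix with spectral radius of A·diag(tanh') below one at the equilibrium would exhibit the same exponential contraction.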
Pages: 1373-1380 (8 pages)