Sufficient and necessary conditions for global exponential stability of discrete-time recurrent neural networks

Cited by: 44
Authors
Wang, Lisheng [1 ]
Xu, Zongben
Affiliations
[1] Shanghai Jiao Tong Univ, Dept Automat, Shanghai 200030, Peoples R China
[2] Xi An Jiao Tong Univ, Res Ctr Sci & Res Appl Math, Xian 710049, Peoples R China
[3] Xi An Jiao Tong Univ, Sch Sci, Inst Informat & Syst Sci, Xian 710049, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
discrete-time recurrent neural networks; exponential bound; global asymptotic stability (GAS); global exponential stability (GES); nonlinear difference equations;
DOI
10.1109/TCSI.2006.874179
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics & Communication Technology];
Discipline codes
0808; 0809;
Abstract
A set of sufficient and necessary conditions is presented for global exponential stability (GES) of a class of generic discrete-time recurrent neural networks. By means of these conditions, the GES and convergence properties of the networks are analyzed quantitatively. It is shown that exact equivalences hold among three properties: the GES of the networks; the contractiveness of the deduced nonlinear operators; and the global asymptotic stability (GAS) of the networks combined with a spectral radius less than one for the Jacobian matrix of the networks at the unique equilibrium point. When the networks have small state-feedback coefficients, it is further shown that the infimum of the exponential bounds on the trajectories of the networks equals exactly the spectral radius of the Jacobian matrix at the unique equilibrium point. The results help clarify the essence of GES and the difference between GES and GAS in discrete-time recurrent neural networks.
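The spectral-radius criterion summarized in the abstract can be illustrated with a minimal numerical sketch. The network form, weights, and activation below are assumptions for illustration only (a toy map x(k+1) = A x(k) + W tanh(x(k)) + u), not the authors' model or proof; the sketch merely iterates to the unique equilibrium and checks that the spectral radius of the Jacobian there is below one, which the paper ties to GES.

```python
import numpy as np

def spectral_radius_at_equilibrium(A, W, u, iters=10000, tol=1e-12):
    """Iterate the toy map x -> A x + W tanh(x) + u to a fixed point,
    then return the spectral radius of its Jacobian there."""
    x = np.zeros(A.shape[0])
    for _ in range(iters):
        x_new = A @ x + W @ np.tanh(x) + u
        if np.max(np.abs(x_new - x)) < tol:
            x = x_new
            break
        x = x_new
    # Jacobian of the map at the fixed point x*: A + W diag(1 - tanh(x*)^2)
    J = A + W @ np.diag(1.0 - np.tanh(x) ** 2)
    return max(abs(np.linalg.eigvals(J)))

# Hypothetical configuration with small state-feedback coefficients (diag A)
A = np.diag([0.2, 0.1])
W = np.array([[0.1, -0.2], [0.05, 0.1]])
u = np.array([0.3, -0.1])
rho = spectral_radius_at_equilibrium(A, W, u)
print(rho < 1.0)  # True here: consistent with the paper's GES criterion
```

Because the activation's derivative is bounded by one and the chosen weights are small, the map is contractive, so the iteration converges and the computed spectral radius stays below one, matching the equivalence the abstract describes.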
Pages: 1373-1380
Number of pages: 8