Absolute exponential stability of neural networks with a general class of activation functions

Cited by: 58
Authors
Liang, XB
Wang, J
Affiliations
[1] Fudan Univ, Dept Comp Sci, Shanghai 200433, Peoples R China
[2] Chinese Univ Hong Kong, Dept Mech & Automat Engn, Shatin, Hong Kong, Peoples R China
Source
IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I-FUNDAMENTAL THEORY AND APPLICATIONS | 2000, Vol. 47, No. 8
Keywords
absolute exponential stability; activation functions; global exponential stability; H-matrix; neural networks; partial Lipschitz continuity;
DOI
10.1109/81.873882
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline classification code
0808 ; 0809 ;
Abstract
This brief investigates the absolute exponential stability (AEST) of neural networks with a general class of partially Lipschitz continuous (defined in Section II) and monotone increasing activation functions. The main result obtained is that if the interconnection matrix T of the network system is such that -T is an H-matrix with nonnegative diagonal elements, then the neural network system is absolutely exponentially stable (AEST); i.e., the network system is globally exponentially stable (GES) for any activation functions in the above class, any constant input vectors, and any other network parameters. The obtained AEST result extends the existing results on absolute stability (ABST) of neural networks with special classes of activation functions in the literature.
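The sufficient condition stated in the abstract can be checked numerically. The sketch below, a minimal illustration not taken from the paper, uses the standard characterization that a square matrix A is an H-matrix iff its comparison matrix M(A) (with diagonal |a_ii| and off-diagonal -|a_ij|) is a nonsingular M-matrix, tested here via positivity of all leading principal minors. The function names are illustrative, not from the paper.

```python
import numpy as np

def comparison_matrix(A):
    """Comparison matrix M(A): diagonal |a_ii|, off-diagonal -|a_ij|."""
    M = -np.abs(A)
    np.fill_diagonal(M, np.abs(np.diag(A)))
    return M

def is_h_matrix(A, tol=1e-12):
    """A is an H-matrix iff M(A) is a nonsingular M-matrix,
    i.e. all leading principal minors of M(A) are positive."""
    M = comparison_matrix(A)
    n = M.shape[0]
    return all(np.linalg.det(M[:k, :k]) > tol for k in range(1, n + 1))

def satisfies_aest_condition(T):
    """Check the paper's sufficient AEST condition:
    -T is an H-matrix with nonnegative diagonal elements."""
    negT = -np.asarray(T, dtype=float)
    return bool(np.all(np.diag(negT) >= 0) and is_h_matrix(negT))
```

For example, T = [[-2, 1], [1, -2]] satisfies the condition (diagonally dominant -T with positive diagonal), while T = [[1, 0], [0, 1]] fails it because -T has negative diagonal elements.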
Pages: 1258-1263
Page count: 6