Global Lagrange Stability of Recurrent Neural Networks with Infinity Distributed Delays

Cited by: 0
Authors
Wang, Xiaohong [1 ]
Zhao, Xuhui [1 ]
Pu, Jiexin [1 ]
Bu, Wenshao [1 ]
Chen, Xingjun [2 ]
Affiliations
[1] Henan Univ Sci & Technol, Coll Informat Engn, Luoyang 471023, Peoples R China
[2] Dalian Naval Acad, Dept Sci Res, Dalian 116018, Peoples R China
Source
2014 IEEE INTERNATIONAL CONFERENCE ON INFORMATION AND AUTOMATION (ICIA) | 2014
Keywords
Recurrent neural network; global exponential stability in Lagrange sense; globally exponentially attractive set; infinity distributed delays; linear matrix inequalities; exponential stability; mixed delays; dissipativity
DOI
Not available
Chinese Library Classification (CLC)
TP [automation technology, computer technology];
Discipline code
0812 ;
Abstract
This paper is concerned with global exponential stability in the Lagrange sense for recurrent neural networks (RNNs) with general activation functions and infinity distributed delays. By employing a new differential inequality and dropping the requirement that the activation functions be bounded, monotonic, and differentiable, several sufficient conditions are derived in terms of linear matrix inequalities (LMIs), which can be checked easily with the LMI Control Toolbox in MATLAB. Moreover, detailed estimates of the globally exponentially attractive sets are given. Compared with previous methods, the results obtained are independent of the time-varying delays and do not require the delay functions to be differentiable; they thus extend and improve earlier publications. Finally, a numerical example demonstrates the effectiveness of the proposed results.
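The abstract states that the sufficient conditions take the form of LMIs that can be checked numerically. As a minimal illustration of what such a feasibility check involves (not the paper's actual LMIs, whose matrices and delay kernels are not given here), the sketch below verifies a classical Lyapunov-type matrix inequality A^T P + P A < 0 by testing negative definiteness via eigenvalues; the matrices A and P are hypothetical toy data.

```python
import numpy as np

def lmi_holds(A: np.ndarray, P: np.ndarray, tol: float = 1e-9) -> bool:
    """Return True if the LMI A^T P + P A < 0 holds, i.e. the symmetric
    matrix M = A^T P + P A is negative definite (all eigenvalues < -tol)."""
    M = A.T @ P + P @ A
    M = (M + M.T) / 2.0  # symmetrize to guard against round-off
    return bool(np.max(np.linalg.eigvalsh(M)) < -tol)

# Toy data (illustrative only): A is Hurwitz, P is the identity.
A = np.array([[-2.0, 0.5],
              [0.3, -1.5]])
P = np.eye(2)
print(lmi_holds(A, P))   # True: A^T + A is negative definite
print(lmi_holds(-A, P))  # False: the inequality fails for -A
```

In practice, tools such as MATLAB's LMI Control Toolbox (mentioned in the abstract) search for a feasible P rather than testing a given one; the check above is only the inner feasibility test.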
Pages: 624-629
Number of pages: 6