New necessary and sufficient conditions for absolute stability of neural networks

Cited by: 16
Authors
Chu, Tianguang [1]
Zhang, Cishen
Affiliations
[1] Peking Univ, Intelligent Control Lab, Ctr Syst & Control, Dept Mech & Engn Sci, Beijing 100871, Peoples R China
[2] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
[3] Nanyang Technol Univ, Sch Chem & Biomed Engn, Singapore 639798, Singapore
Funding
National Natural Science Foundation of China;
Keywords
absolute stability; asymmetric connection; exponential convergence; global asymptotic stability; neural networks; solvable Lie algebra condition;
DOI
10.1016/j.neunet.2006.06.003
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
This paper presents new necessary and sufficient conditions for absolute stability of asymmetric neural networks. The main result is based on a solvable Lie algebra condition, which generalizes existing results for symmetric and normal neural networks. An exponential convergence estimate for the neural networks is also obtained. Further, it is demonstrated how to generate, through simple procedures, larger sets of weight matrices that guarantee absolute stability of the neural networks, starting from known normal weight matrices. The approach is nontrivial in that the resulting weight matrix set may contain non-normal matrices. The results also provide a finite check for robust stability of neural networks in the presence of parameter uncertainties. (c) 2006 Elsevier Ltd. All rights reserved.
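The main criterion in the abstract hinges on a solvable Lie algebra condition. As an illustration only (not the paper's exact algorithm, and with function names chosen here for clarity), the following Python/NumPy sketch numerically tests whether the matrix Lie algebra generated by a given set of matrices is solvable, by building a basis of the generated algebra under the commutator bracket and then checking that its derived series terminates at the zero algebra.

# Numerical sketch (illustrative, not taken from the paper): test whether the
# matrix Lie algebra generated by a set of n x n matrices is solvable, i.e.
# whether the derived series L^(0) = L, L^(k+1) = [L^(k), L^(k)] reaches {0}.
import numpy as np

TOL = 1e-9  # rank tolerance for the numerical span computations


def _span_basis(mats, tol=TOL):
    """Return a maximal linearly independent subset of `mats`."""
    basis, vecs = [], []
    for m in mats:
        candidate = vecs + [m.ravel()]
        if np.linalg.matrix_rank(np.vstack(candidate), tol=tol) > len(vecs):
            basis.append(m)
            vecs = candidate
    return basis


def lie_algebra_basis(generators, tol=TOL):
    """Basis of the matrix Lie algebra generated by `generators` (closure under brackets)."""
    basis = _span_basis([np.asarray(g, dtype=float) for g in generators], tol)
    while True:
        brackets = [a @ b - b @ a for a in basis for b in basis]
        new_basis = _span_basis(basis + brackets, tol)
        if len(new_basis) == len(basis):
            return basis
        basis = new_basis


def is_solvable(generators, tol=TOL):
    """True if the Lie algebra generated by `generators` is (numerically) solvable."""
    layer = lie_algebra_basis(generators, tol)
    while layer:
        brackets = [a @ b - b @ a for a in layer for b in layer]
        derived = _span_basis(brackets, tol)
        if len(derived) == len(layer):  # derived series stalled above {0}
            return False
        layer = derived
    return True  # series terminated at the zero algebra


if __name__ == "__main__":
    # Upper-triangular matrices generate a solvable Lie algebra ...
    U1 = np.array([[1.0, 2.0], [0.0, 3.0]])
    U2 = np.array([[0.0, 1.0], [0.0, -1.0]])
    print(is_solvable([U1, U2]))   # expected: True
    # ... whereas the standard sl(2) generators e and f do not.
    e = np.array([[0.0, 1.0], [0.0, 0.0]])
    f = np.array([[0.0, 0.0], [1.0, 0.0]])
    print(is_solvable([e, f]))     # expected: False

This uses only bilinearity of the bracket: brackets of basis elements span the bracket of the whole span, so the derived series can be tracked basis by basis. How such a solvability check combines with the network's weight matrices in the paper's necessary and sufficient condition is specified in the article itself.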
Pages: 94-101
Page count: 8