Complete Stability of Delayed Recurrent Neural Networks With New Wave-Type Activation Functions

Cited by: 2
Authors
Yan, Zepeng [1 ]
Sun, Wen [1 ]
Guo, Wanli [2 ]
Li, Biwen [1 ]
Wen, Shiping [3 ]
Cao, Jinde [4 ,5 ,6 ]
Affiliations
[1] Hubei Normal Univ, Sch Math & Stat, Huangshi Key Lab Metaverse & Virtual Simulat, Huangshi 435002, Hubei, Peoples R China
[2] China Univ Geosci, Sch Math & Phys, Wuhan 430074, Peoples R China
[3] Univ Technol Sydney, Australian Artificial Intelligence Inst, Ultimo, NSW 2007, Australia
[4] Southeast Univ, Sch Math, Nanjing 211189, Peoples R China
[5] Purple Mt Labs, Nanjing 211111, Peoples R China
[6] Ahlia Univ, Manama 10878, Bahrain
Funding
National Natural Science Foundation of China;
Keywords
Artificial neural networks; Numerical stability; Stability criteria; Neurons; Delays; Sun; Thermal stability; Complete stability; delayed recurrent neural network (DRNN); time-varying delay; wave-type activation function; ASSOCIATIVE MEMORY; MULTISTABILITY; MULTIPLE;
DOI
10.1109/TNNLS.2024.3394854
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Activation functions have a significant effect on the dynamics of neural networks (NNs). This study proposes new nonmonotonic wave-type activation functions and examines the complete stability of delayed recurrent NNs (DRNNs) equipped with them. Using the geometric properties of the wave-type activation function and a subsequent iteration scheme, sufficient conditions are derived under which a DRNN with n neurons has exactly (2m+3)^n equilibria, of which (m+2)^n are locally exponentially stable and the remaining (2m+3)^n - (m+2)^n are unstable, where the positive integer m is determined by the wave-type activation function. Furthermore, the DRNN with the proposed activation function is completely stable. Compared with the previous literature, both the total number of equilibria and the number of stable equilibria increase significantly, thereby enhancing the memory storage capacity of the DRNN. Finally, several examples are presented to demonstrate the proposed results.
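The equilibrium counts stated in the abstract are simple arithmetic in n and m. A minimal sketch (the function name and the example values of n and m are illustrative, not taken from the paper):

```python
def equilibrium_counts(n: int, m: int) -> tuple[int, int, int]:
    """Counts from the abstract, for a DRNN with n neurons and
    wave-activation parameter m:
      total equilibria:                  (2m + 3)^n
      locally exponentially stable ones: (m + 2)^n
      unstable ones:                     the difference
    """
    total = (2 * m + 3) ** n
    stable = (m + 2) ** n
    return total, stable, total - stable

# Illustrative values: n = 2 neurons, m = 1 gives
# total = 5**2 = 25, stable = 3**2 = 9, unstable = 16.
print(equilibrium_counts(2, 1))  # → (25, 9, 16)
```

Note that for m = 0 the formulas reduce to 3^n equilibria with 2^n stable, the counts familiar from classical multistability results for sigmoid-like activations, which is the baseline the abstract's "significantly increase" claim is measured against.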
Pages: 6584-6596 (13 pages)
Related Papers (46 in total)
[1] H∞ state estimation of stochastic memristor-based neural networks with time-varying delays [J]. Bao, Haibo; Cao, Jinde; Kurths, Juergen; Alsaedi, Ahmed; Ahmad, Bashir. NEURAL NETWORKS, 2018, 99: 79-91.
[2] Multistability of State-Dependent Switched Fractional-Order Hopfield Neural Networks With Mexican-Hat Activation Function and Its Application in Associative Memories [J]. Cao, Boqiang; Nie, Xiaobing; Zheng, Wei Xing; Cao, Jinde. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36 (01): 1213-1227.
[3] Stability analysis of delayed cellular neural networks [J]. Cao, JD; Zhou, DM. NEURAL NETWORKS, 1998, 11 (09): 1601-1605.
[4] Fixed-time synchronization of delayed memristor-based recurrent neural networks [J]. Cao, Jinde; Li, Ruoxia. SCIENCE CHINA-INFORMATION SCIENCES, 2017, 60 (03).
[5] Multistability in recurrent neural networks [J]. Cheng, Chang-Yuan; Lin, Kuang-Hui; Shih, Chih-Wen. SIAM JOURNAL ON APPLIED MATHEMATICS, 2006, 66 (04): 1301-1320.
[6] Multistability for Delayed Neural Networks via Sequential Contracting [J]. Cheng, Chang-Yuan; Lin, Kuang-Hui; Shih, Chih-Wen; Tseng, Jui-Pin. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2015, 26 (12): 3109-3122.
[7] Multistability of Dynamic Memristor Delayed Cellular Neural Networks With Application to Associative Memories [J]. Deng, Kun; Zhu, Song; Bao, Gang; Fu, Jun; Zeng, Zhigang. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (02): 690-702.
[8] New Conditions for Global Asymptotic Stability of Memristor Neural Networks [J]. Di Marco, Mauro; Forti, Mauro; Pancioni, Luca. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29 (05): 1822-1834.
[9] Robust stability analysis of a class of neural networks with discrete time delays [J]. Faydasicok, Ozlem; Arik, Sabri. NEURAL NETWORKS, 2012, 29-30: 52-59.
[10] Multistability of Switched Neural Networks With Gaussian Activation Functions Under State-Dependent Switching [J]. Guo, Zhenyuan; Ou, Shiqin; Wang, Jun. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (11): 6569-6583.