A recurrent neural network with predefined-time convergence and improved noise tolerance for dynamic matrix square root finding

Cited by: 79
Authors
Li, Weibing [1 ]
Liao, Bolin [1 ]
Xiao, Lin [1 ]
Lu, Rongbo [1 ]
Affiliations
[1] Jishou Univ, Coll Informat Sci & Engn, Jishou 416000, Peoples R China
Funding
National Natural Science Foundation of China; Natural Science Foundation of Hunan Province;
Keywords
Zeroing neural network; Dynamic square root finding; Predefined-time convergence; Noise tolerance; VARYING SYLVESTER EQUATION; FINITE-TIME; STABILIZATION;
DOI
10.1016/j.neucom.2019.01.072
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Zeroing neural network (ZNN, also termed Zhang neural network after its inventors) is an effective approach to dynamic matrix square root (DMSR) finding, which arises in numerous fields of science and engineering. Conventional ZNN models obtain the theoretical DMSR in infinitely long time or in finite time. However, in some applications, especially those that must satisfy hard time constraints, these ZNN models may fail to guarantee timely convergence. Hence, for solving DMSR, a ZNN model whose convergence time can be explicitly defined in advance is preferable. Robustness to external noise is also essential for a neural network model. Unfortunately, the existing ZNN models exhibit limited noise-tolerance capability, and their steady-state residual errors are only theoretically bounded when the models are perturbed by dynamic bounded non-vanishing noises. To enhance the existing ZNN models, this paper uses two novel activation functions to make, for the first time, a ZNN model predefined-time convergent with improved noise-tolerance capability. The convergence time of the accelerated ZNN model can be explicitly specified as a constant parameter known a priori. More importantly, the resulting predefined-time convergent ZNN (PTZNN) is capable of theoretically and completely rejecting dynamic bounded vanishing and non-vanishing noises. For constant noises, such as large constant model-implementation errors, the PTZNN achieves improved noise-tolerance performance compared with the existing ZNN models. Comparative simulation results demonstrate that the proposed PTZNN delivers superior convergence and robustness for solving DMSR in comparison with the existing ZNN models. (C) 2019 Elsevier B.V. All rights reserved.
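To make the ZNN idea behind the abstract concrete, the following is a minimal sketch of a ZNN solver for the DMSR problem X(t)^2 = A(t). It defines the error E = X^2 - A and imposes the classical linear design dE/dt = -gamma*E, which yields the Sylvester-type equation X_dot@X + X@X_dot = A_dot - gamma*E, solved for X_dot at each Euler step via Kronecker products. Note the hedges: this sketch uses the standard linear activation (so convergence is exponential, not the paper's predefined-time behavior), and the function name `znn_dmsr`, the gain `gamma`, and the example A(t) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def znn_dmsr(A, A_dot, X0, gamma=10.0, dt=1e-3, T=2.0):
    """Euler-integrate linear-activation ZNN dynamics for X(t)^2 = A(t).

    The design formula dE/dt = -gamma*E with E = X@X - A leads to
    X_dot@X + X@X_dot = A_dot - gamma*E, solved for X_dot each step.
    """
    n = X0.shape[0]
    I = np.eye(n)
    X = X0.copy()
    for k in range(int(T / dt)):
        t = k * dt
        E = X @ X - A(t)
        rhs = A_dot(t) - gamma * E
        # vec(X_dot@X + X@X_dot) = (kron(X.T, I) + kron(I, X)) vec(X_dot),
        # using column-stacking vec (order='F').
        M = np.kron(X.T, I) + np.kron(I, X)
        x_dot = np.linalg.solve(M, rhs.flatten(order="F"))
        X = X + dt * x_dot.reshape(n, n, order="F")
    return X

# Time-varying test matrix built from a known square root S(t).
def S(t):
    return np.array([[2.0 + 0.5 * np.sin(t), 0.3],
                     [0.3, 3.0 + 0.5 * np.cos(t)]])

def A(t):
    return S(t) @ S(t)

def A_dot(t, h=1e-6):
    # Central-difference derivative is accurate enough for this sketch.
    return (A(t + h) - A(t - h)) / (2 * h)

X_final = znn_dmsr(A, A_dot, X0=2.0 * np.eye(2))
residual = np.linalg.norm(X_final @ X_final - A(2.0))
```

After integrating to T = 2 s, `residual` is small because the error dynamics contract at rate gamma; the paper's contribution is to replace the linear activation with predefined-time ones so that this contraction completes within a user-specified time even under bounded noise.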
Pages: 262-273
Number of pages: 12