Event-triggered H∞ state estimation for semi-Markov jumping discrete-time neural networks with quantization

Cited by: 52
Authors
Rakkiyappan, R. [1 ]
Maheswari, K. [2 ]
Velmurugan, G. [1 ]
Park, Ju H. [3 ,4 ]
Affiliations
[1] Bharathiar Univ, Dept Math, Coimbatore 641046, Tamil Nadu, India
[2] Kumaraguru Coll Technol, Dept Math, Coimbatore 641049, Tamil Nadu, India
[3] Chongqing Normal Univ, Sch Math Sci, Chongqing 401331, Peoples R China
[4] Yeungnam Univ, Dept Elect Engn, 280 Daehak Ro, Kyongsan 38541, South Korea
Funding
National Research Foundation of Singapore;
Keywords
Semi-Markov jump neural networks; Exponential stability; H-infinity control; Event-trigger scheme; Quantization; EXPONENTIAL STABILITY; DIFFERENTIAL-EQUATIONS; MULTIAGENT SYSTEMS; LINEAR-SYSTEMS; SYNCHRONIZATION; APPROXIMATION; STABILIZATION; PARAMETERS; OPERATORS; DESIGN;
DOI
10.1016/j.neunet.2018.05.007
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper investigates the H-infinity state estimation problem for a class of semi-Markovian jumping discrete-time neural networks with an event-triggered scheme and quantization. First, a new event-triggered communication scheme is introduced to determine whether the current sampled sensor data should be broadcast and transmitted to the quantizer, which saves limited communication resources. Second, a logarithmic quantizer is employed in the communication framework to quantize the transmitted data and reduce the data transmission rate in the network, which improves the communication efficiency of the network. Third, a sufficient condition, expressed in terms of linear matrix inequalities, is derived that guarantees a prescribed H-infinity performance level for the estimation error system. Finally, numerical simulations are given to illustrate the effectiveness of the proposed scheme. (C) 2018 Elsevier Ltd. All rights reserved.
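The two communication-saving mechanisms the abstract describes — an event-triggered transmission test and a logarithmic quantizer — take a standard general form in the literature, sketched below. This is an illustrative sketch only, not the paper's exact scheme; the parameter names `rho` (quantization density), `u0` (base quantization level), and `sigma` (trigger threshold) are assumptions for the example.

```python
import numpy as np

def log_quantize(x, rho=0.5, u0=1.0):
    """Logarithmic quantizer: map scalar x to the nearest level
    sign(x) * u0 * rho**i, with density parameter rho in (0, 1).
    The relative error is bounded by (1 - rho) / (1 + rho)."""
    if x == 0.0:
        return 0.0
    i = round(np.log(abs(x) / u0) / np.log(rho))  # nearest exponent
    return np.sign(x) * u0 * rho**i

def event_triggered(x_current, x_last_sent, sigma=0.1):
    """Relative-error event trigger: transmit the current sample only
    when it deviates from the last transmitted sample by more than a
    fraction sigma of its own norm."""
    err = np.linalg.norm(np.asarray(x_current) - np.asarray(x_last_sent))
    return err > sigma * np.linalg.norm(x_current)
```

In such schemes the sensor evaluates `event_triggered` at every sampling instant; only when it returns `True` is the sample passed through `log_quantize` and sent over the network, which is how the limited communication resource is spared.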
Pages: 236-248
Page count: 13