Bounds of the induced norm and model reduction errors for systems with repeated scalar nonlinearities

Cited by: 93
Authors
Chu, YC [1]
Glover, K [1]
Affiliations
[1] Univ Cambridge, Dept Engn, Cambridge CB2 1PZ, England
Keywords
balanced model reduction; diagonal stability; diagonally dominant matrices; linear fractional transformations; linear matrix inequalities; recurrent neural networks
DOI
10.1109/9.751342
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline classification code
0812
Abstract
The class of nonlinear systems described by a discrete-time state equation containing a repeated scalar nonlinearity, as in recurrent neural networks, is considered. Sufficient conditions for the stability and the induced norm of such systems are derived using positive definite, diagonally dominant Lyapunov functions or storage functions that satisfy appropriate linear matrix inequalities. Results on model reduction errors for such systems are also presented.
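For context, a minimal sketch of the system class and of the type of condition the abstract refers to, written in the standard recurrent-neural-network form used in this literature; the exact model and linear matrix inequalities are those of the paper and may differ in detail from this sketch.
\[
  x(k+1) = \phi\bigl(A\,x(k) + B\,u(k)\bigr), \qquad
  y(k)   = \phi\bigl(C\,x(k) + D\,u(k)\bigr),
\]
where the same scalar map \(\phi\), assumed to satisfy \(|\phi(v)| \le |v|\), is applied to every entry of its vector argument (the "repeated scalar nonlinearity"). A representative stability condition of the kind described asks for a symmetric matrix \(P \succ 0\) that is diagonally dominant,
\[
  p_{ii} \;\ge\; \sum_{j \ne i} |p_{ij}|, \qquad i = 1,\dots,n,
\]
and satisfies the discrete-time Lyapunov linear matrix inequality
\[
  A^{\mathsf T} P A - P \;\prec\; 0.
\]
Roughly speaking, diagonal dominance ensures that the quadratic form \(x^{\mathsf T} P x\) cannot increase when the entrywise nonlinearity is applied, so the usual LMI machinery for linear systems carries over; the induced-norm and model reduction error bounds described in the abstract use storage functions restricted in the same way.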
Pages: 471-483
Number of pages: 13