Global convergence of training methods for neural networks based on the state-estimation

Cited by: 0
Authors
Tsumura, T [1 ]
Tatsumi, K [1 ]
Tanino, T [1 ]
Affiliation
[1] IBM Corp, Tokyo Res Lab, Tokyo, Japan
Source
SICE 2003 ANNUAL CONFERENCE, VOLS 1-3 | 2003
Keywords
neural networks; extended Kalman filter; extended H-infinity filter; training method; global convergence; simplified method;
DOI
Not available
CLC classification number
TP [Automation technology, computer technology];
Discipline classification code
0812 ;
Abstract
Although the EKF (Extended Kalman Filter) has been widely used as a training method for neural networks (NN), it is known to have poor robustness to disturbances. Recently, an EHF (Extended H-infinity Filter)-based training method was proposed, which improves robustness to noise. However, its convergence properties were not yet known. In this paper, we show that the EHF-based method can be regarded as a minimization method for a least-squares problem and that it has a deterministic global convergence property. Moreover, we propose a new simplified version of the EKF- and EHF-based methods for NN training and verify the efficiency of the proposed method.
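For orientation, EKF-based training of the kind the abstract discusses treats the network weights as the filter's state vector and the network output as the (nonlinear) measurement, linearized through its Jacobian at each sample. The sketch below illustrates this on a single tanh unit; the network, noise covariances, and toy data are illustrative assumptions, not the paper's algorithm or experiments.

```python
import numpy as np

# Sketch of EKF-based training for a one-output network y = tanh(w . x).
# The weight vector w is the filter state; the measurement model is the
# network output, linearized via its Jacobian H = dy/dw = (1 - y^2) x.

def train_ekf(X, t, q=1e-4, r=0.1, p0=10.0):
    n = X.shape[1]
    w = np.zeros(n)                   # state estimate (network weights)
    P = p0 * np.eye(n)                # state error covariance
    Q = q * np.eye(n)                 # process noise covariance
    for x, target in zip(X, t):
        y = np.tanh(w @ x)            # network prediction
        H = (1.0 - y**2) * x          # output Jacobian w.r.t. weights
        P = P + Q                     # time update
        s = H @ P @ H + r             # innovation variance (scalar output)
        K = P @ H / s                 # Kalman gain
        w = w + K * (target - y)      # measurement update of the weights
        P = P - np.outer(K, H) @ P    # covariance update
    return w

# Toy data: noisy observations of tanh of a fixed linear map.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(500, 3))
t = np.tanh(X @ w_true) + 0.05 * rng.normal(size=500)
w_hat = train_ekf(X, t)
```

Each pass through the data refines both the weight estimate and its covariance, which is what lets the paper's authors reinterpret the recursion as an incremental least-squares minimization.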
Pages: 1266 - 1271
Page count: 6
Related papers
8 items
  • [1] Incremental least squares methods and the extended Kalman filter
    Bertsekas, DP
    [J]. SIAM JOURNAL ON OPTIMIZATION, 1996, 6 (03) : 807 - 822
  • [2] DOUGLAS SC, 1991, P IJCNN, V1, P307
  • [3] MORIYAMA H, 2001, 2001002 KYOT U DEP A
  • [4] H∞-learning of layered neural networks
    Nishiyama, K
    Suzuki, K
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2001, 12 (06): : 1265 - 1277
  • [5] PUSKORIUS GV, 1991, P INT JOINT C NEUR N, V1, P771
  • [6] SHAH S, 1990, IJCNN INT JOINT C NE, V3, P1265
  • [7] SINGHAL S, 1989, ICASSP 89 1989 INT C, V2, P1887
  • [8] A local linearized least squares algorithm for training feedforward neural networks
    Stan, O
    Kamen, E
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2000, 11 (02): : 487 - 495