Total least squares for block training of neural networks

Cited: 0
Authors
Navia-Vázquez, A [1 ]
Figueiras-Vidal, AR [1 ]
Affiliation
[1] Univ Carlos III Madrid, ATSC DTC, Leganes Madrid 28911, Spain
Keywords
perceptron; noise; total least-squares; block training;
DOI
10.1016/S0925-2312(99)00107-1
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper is intended as a contribution to the better understanding and improvement of training methods for neural networks. Instead of the classical gradient-descent approach, we adopt a different point of view in terms of block least-squares minimizations, which ultimately leads to the inclusion of total least-squares methods in the learning framework. We propose a training method for multilayer perceptrons that combines the reduced computational cost typical of block methods, a procedure for correcting the well-known sensitivity problems of these approaches, and the layer-wise application of a total least-squares algorithm, which is highly resistant to noise in the data. The new method, which we call reduced-sensitivity total least-squares (RS-TLS) training, demonstrates good performance in practical applications. (C) 1999 Elsevier Science B.V. All rights reserved.
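The abstract's core tool, a layer-wise total least-squares fit, can be illustrated with the classic SVD-based TLS solve. This is a minimal sketch of plain TLS on a single linear layer, not the paper's RS-TLS procedure; the function name `tls_solve` and the noise levels are illustrative assumptions. Unlike ordinary least squares, TLS accounts for noise in the inputs as well as the targets, which is the property the abstract highlights.

```python
import numpy as np

def tls_solve(A, b):
    """Total least-squares solution of A x ~ b via the SVD of the
    augmented matrix [A | b]: the right singular vector for the
    smallest singular value spans the best rank-deficient fit."""
    Ab = np.column_stack([A, b])
    _, _, Vt = np.linalg.svd(Ab)
    v = Vt[-1]                    # last right singular vector of [A | b]
    return -v[:-1] / v[-1]        # rescale so the b-coefficient is -1

rng = np.random.default_rng(0)
x_true = np.array([2.0, -1.0])
A = rng.normal(size=(200, 2))
b = A @ x_true
# Perturb BOTH the inputs and the targets: the errors-in-variables
# setting in which TLS is preferable to ordinary least squares.
A_noisy = A + 0.05 * rng.normal(size=A.shape)
b_noisy = b + 0.05 * rng.normal(size=b.shape)
x_tls = tls_solve(A_noisy, b_noisy)
```

On the noiseless pair `(A, b)` the augmented matrix is exactly rank-deficient, so `tls_solve` recovers `x_true`; on the noisy pair it returns a nearby estimate. In the paper's setting this solve would be applied per layer within a block training scheme.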
Pages: 213-217
Page count: 5