CONSISTENCY OF MULTILAYER PERCEPTRON REGRESSION ESTIMATORS

Cited by: 25
Authors
MIELNICZUK, J
TYRCHA, J
Keywords
MULTILAYER PERCEPTRON; LEAST SQUARES REGRESSION ESTIMATOR; ENTROPY; BACK PROPAGATION; VAPNIK-CHERVONENKIS CLASS
DOI
10.1016/S0893-6080(09)80011-7
CLC Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In the paper a three-layer perceptron with one hidden layer and an output layer consisting of a single neuron is considered. This is a commonly used architecture for solving regression problems, where one seeks a perceptron minimizing the mean squared error criterion for the data points (x(k), y(k)), k = 1, ..., N. It is shown that in the model y(k) = g0(x(k)) + epsilon(k), k = 1, ..., N, where x(k) is independent of the zero-mean error term epsilon(k), this procedure is consistent as N --> infinity, provided that g0 is representable as a three-layer perceptron with a Heaviside transfer function. The same result holds when the transfer function is an arbitrary continuous function with bounded limits at +/- infinity and the hidden-to-output weights in the considered family of perceptrons are bounded.
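As a minimal illustrative sketch (not taken from the paper), the Python code below generates data from the model y(k) = g0(x(k)) + epsilon(k) and fits a three-layer perceptron with one hidden layer and a single linear output neuron by minimizing the mean squared error criterion via back propagation. The logistic sigmoid is used as the transfer function, which is continuous with bounded limits at +/- infinity; the choice of g0, hidden size, learning rate, and iteration count are all assumptions for illustration, and the bound on the hidden-to-output weights required by the second theorem is not enforced here.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# True regression function g0: itself a small perceptron, so the model is
# well specified in the sense required by the consistency result.
# This particular g0 is an illustrative assumption, not from the paper.
def g0(x):
    return 2.0 * sigmoid(3.0 * x - 1.0) - sigmoid(-2.0 * x)

N = 1000
x = rng.uniform(-2.0, 2.0, size=(N, 1))
eps = rng.normal(0.0, 0.1, size=(N, 1))   # zero-mean error, independent of x
y = g0(x) + eps

H = 8        # number of hidden neurons (assumption)
lr = 0.2     # learning rate (assumption)
W1 = rng.normal(0.0, 1.0, size=(1, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 1.0, size=(H, 1)); b2 = np.zeros(1)

for _ in range(5000):
    # forward pass through the three-layer perceptron
    h = sigmoid(x @ W1 + b1)    # hidden layer, sigmoid transfer function
    yhat = h @ W2 + b2          # single linear output neuron
    err = yhat - y
    mse = np.mean(err ** 2)     # the mean squared error criterion
    # back propagation of the MSE gradient
    d_yhat = 2.0 * err / N
    dW2 = h.T @ d_yhat; db2 = d_yhat.sum(axis=0)
    d_h = (d_yhat @ W2.T) * h * (1.0 - h)
    dW1 = x.T @ d_h; db1 = d_h.sum(axis=0)
    # gradient descent step
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(f"final training MSE: {mse:.4f}  (noise variance is 0.01)")

As N grows, the theorem asserts that the least squares estimator in this family converges to g0; the sketch only demonstrates the estimation procedure for one fixed N.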
Pages: 1019-1022 (4 pages)