REGULARIZED NEURAL NETWORKS - SOME CONVERGENCE RATE RESULTS

Cited: 16
Authors
CORRADI, V
WHITE, H
Affiliations
[1] UNIV CALIF SAN DIEGO, DEPT ECON, SAN DIEGO, CA 92103
[2] INST NEURAL COMPUTAT, SAN DIEGO, CA
DOI: 10.1162/neco.1995.7.6.1225
Chinese Library Classification (CLC): TP18 (Artificial Intelligence Theory)
Discipline codes: 081104 ; 0812 ; 0835 ; 1405
Abstract
In a recent paper, Poggio and Girosi (1990) proposed a class of neural networks obtained from the theory of regularization. Regularized networks are capable of approximating arbitrarily well any continuous function on a compactum. In this paper we consider in detail the learning problem for the one-dimensional case. We show that in the case of output data observed with noise, regularized networks are capable of learning and approximating (on compacta) elements of certain classes of Sobolev spaces, known as reproducing kernel Hilbert spaces (RKHS), at a nonparametric rate that optimally exploits the smoothness properties of the unknown mapping. In particular we show that the total squared error, given by the sum of the squared bias and the variance, will approach zero at a rate of n^(-2m/(2m+1)), where m denotes the order of differentiability of the true unknown function. On the other hand, if the unknown mapping is a continuous function but does not belong to an RKHS, then there still exists a unique regularized solution, but this is no longer guaranteed to converge in mean square to a well-defined limit. Further, even if such a solution converges, the total squared error is bounded away from zero for all n sufficiently large.
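For a one-dimensional problem of the kind the abstract describes, the regularized solution in an RKHS reduces to a kernel ridge estimator fitted to noisy outputs. The sketch below is illustrative only: the Gaussian kernel, its width, the penalty `lam`, and the target function are assumptions chosen for the demonstration, not choices taken from the paper.

```python
import numpy as np

def rbf_kernel(x, z, width=0.5):
    """Gaussian RBF kernel matrix between 1-D sample vectors x and z."""
    return np.exp(-(x[:, None] - z[None, :]) ** 2 / (2.0 * width ** 2))

def fit_regularized_network(x_train, y_train, lam=1e-3, width=0.5):
    """Regularized (kernel ridge) fit: solve (K + n*lam*I) c = y for c."""
    n = len(x_train)
    K = rbf_kernel(x_train, x_train, width)
    return np.linalg.solve(K + n * lam * np.eye(n), y_train)

def predict(x_test, x_train, c, width=0.5):
    """Evaluate the fitted network: a kernel expansion over the training points."""
    return rbf_kernel(x_test, x_train, width) @ c

# Noisy observations of a smooth (hence RKHS-friendly) target, as in the
# paper's setting of output data observed with noise.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-1.0, 1.0, 200))
y = np.sin(np.pi * x) + 0.1 * rng.standard_normal(200)

c = fit_regularized_network(x, y)
grid = np.linspace(-1.0, 1.0, 101)
fit = predict(grid, x, c)
mse = np.mean((fit - np.sin(np.pi * grid)) ** 2)
```

The diagonal loading `n * lam` plays the role of the regularization parameter; letting it shrink at an appropriate rate as n grows is what yields the nonparametric rate discussed in the abstract.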
Pages: 1225-1244 (20 pages)
References (20 items)
[11]  
JERRI AJ, 1985, INTRO INTEGRAL EQUAT
[12]  
Judge GG, 1985, THEORY PRACTICE ECON, 2nd ed.
[13]  
LUKAS MA, 1988, MATH COMPUT, V42, P107
[14]  
POGGIO T, 1990, BIOL CYBERN, V63, P169
[15]  
Riesz F., 1955, FUNCTIONAL ANAL
[16]  
STINCHCOMBE M, 1990, P INT JOINT C NEURAL, V3, P7
[17]  
TRICOMI FG, 1956, INTEGRAL EQUATIONS
[18]   PRACTICAL APPROXIMATE SOLUTIONS TO LINEAR OPERATOR EQUATIONS WHEN DATA ARE NOISY [J].
WAHBA, G .
SIAM JOURNAL ON NUMERICAL ANALYSIS, 1977, 14 (04) :651-667
[19]  
Wahba G, 1990, SPLINE MODELS OBSERV
[20]   ON RADIAL BASIS FUNCTION NETS AND KERNEL REGRESSION - STATISTICAL CONSISTENCY, CONVERGENCE-RATES, AND RECEPTIVE-FIELD SIZE [J].
XU, L ;
KRZYZAK, A ;
YUILLE, A .
NEURAL NETWORKS, 1994, 7 (04) :609-628