Specification of training sets and the number of hidden neurons for multilayer perceptrons

Cited by: 27
Authors
Camargo, LS [1]
Yoneyama, T [1]
Affiliations
[1] Inst Tecnol Aeronaut, Sao Jose dos Campos, Brazil
Keywords
DOI
10.1162/089976601317098484
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This work concerns the selection of input-output pairs for improved training of multilayer perceptrons in the context of approximating univariate real functions. A criterion for choosing the number of neurons in the hidden layer is also provided. The main idea rests on two facts: Chebyshev polynomials can approximate bounded functions to within a prescribed tolerance, and a polynomial of a given order can, in turn, be fitted by a three-layer perceptron with a prescribed number of hidden neurons. The results are applied to a sensor identification example.
Pages: 2673-2680
Number of pages: 8
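
As a rough illustration of the approach summarized in the abstract, the sketch below uses NumPy to find the smallest Chebyshev interpolation degree that reproduces a target function within a prescribed tolerance; the interpolation nodes then serve as the training inputs, and the degree suggests the hidden-layer size. This is a minimal sketch, not the authors' procedure: the example function, the interval, the tolerance, the dense error-check grid, and the one-hidden-neuron-per-degree rule are all assumptions made for illustration.

    import numpy as np
    from numpy.polynomial import chebyshev as C

    def chebyshev_training_set(f, a, b, tol, max_deg=60):
        """Find the smallest Chebyshev degree n whose interpolant of f on [a, b]
        stays within `tol` of f on a dense grid, and return the mapped nodes.

        Illustrative sketch: the nodes become the MLP training inputs and n
        guides the number of hidden neurons (an assumption, not the paper's rule).
        """
        xs = np.linspace(a, b, 2001)                 # dense grid for the error check
        ts = 2.0 * (xs - a) / (b - a) - 1.0          # same grid mapped to [-1, 1]
        for n in range(1, max_deg + 1):
            k = np.arange(n + 1)
            t_nodes = np.cos((2 * k + 1) * np.pi / (2 * (n + 1)))  # Chebyshev nodes in [-1, 1]
            x_nodes = 0.5 * (b - a) * t_nodes + 0.5 * (a + b)      # nodes mapped to [a, b]
            coeffs = C.chebfit(t_nodes, f(x_nodes), n)             # interpolating Chebyshev series
            if np.max(np.abs(C.chebval(ts, coeffs) - f(xs))) <= tol:
                return n, x_nodes, f(x_nodes)
        raise ValueError("tolerance not reached within max_deg")

    if __name__ == "__main__":
        target = lambda x: np.exp(-x) * np.sin(3.0 * x)   # assumed example function
        n, x_train, y_train = chebyshev_training_set(target, 0.0, 2.0, tol=1e-3)
        hidden_neurons = n   # assumed rule of thumb: one hidden neuron per polynomial degree
        print(f"degree {n}: {len(x_train)} training pairs, {hidden_neurons} hidden neurons")

The (x_train, y_train) pairs produced this way could then be used to train a one-hidden-layer perceptron with any standard toolkit; the point of the sketch is only how the tolerance fixes the polynomial degree, and how the degree fixes both the training set and the hidden-layer size.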