Reduced HyperBF Networks: Regularization by Explicit Complexity Reduction and Scaled Rprop-Based Training

Cited by: 10
Authors
Mahdi, Rami N. [1 ]
Rouchka, Eric Christian [2 ]
Affiliations
[1] Weill Cornell Med Coll, Dept Med Genet, New York, NY 10065 USA
[2] Univ Louisville, Dept Comp Engn & Comp Sci, Louisville, KY 40292 USA
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2011, Vol. 22, No. 05
Keywords
Bridge regression; generalized RBF; HyperBF; localized dimension reduction; reduced HyperBF; regularized HyperBF; weight decay; NEURAL-NETWORKS; APPROXIMATION; ALGORITHM; IDENTIFICATION; CONVERGENCE; RECOGNITION;
D O I
10.1109/TNN.2011.2109736
CLC classification number
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Hyper basis function (HyperBF) networks are generalized radial basis function neural networks in which the activation function is a radial function of a weighted distance. This generalization gives HyperBF networks a high capacity to learn complex functions, which in turn makes them susceptible to overfitting and poor generalization. Moreover, training a HyperBF network requires the weights, centers, and local scaling factors to be optimized simultaneously. For a relatively large dataset with a large network structure, such optimization becomes computationally challenging. In this paper, a new regularization method that performs soft local dimension reduction in addition to weight decay is proposed. The regularized HyperBF network is shown to provide classification accuracy competitive with that of a support vector machine while requiring a significantly smaller network structure. Furthermore, a practical training procedure for constructing HyperBF networks is presented. Hierarchical clustering is used to initialize the neurons, followed by gradient-based optimization using a scaled version of the Rprop algorithm with a localized partial backtracking step. Experimental results on seven datasets show that the proposed training provides faster and smoother convergence than the regular Rprop algorithm.
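As a concrete illustration of the weighted-distance activation and the regularization idea described in the abstract, the sketch below shows a HyperBF forward pass with per-neuron diagonal scaling factors and a regularizer that combines weight decay with a penalty driving scaling factors toward zero (soft local dimension reduction). All names, the Gaussian choice of radial function, and the exact penalty form are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def hyperbf_forward(X, centers, scales, weights):
    """Forward pass of a HyperBF network with diagonal per-neuron scaling.

    X:       (n_samples, n_features)  input points
    centers: (n_neurons, n_features)  neuron centers
    scales:  (n_neurons, n_features)  per-dimension scaling factors
    weights: (n_neurons,)             output-layer weights
    """
    # Weighted squared distance per neuron: sum_d (s_jd * (x_d - c_jd))^2
    diff = X[:, None, :] - centers[None, :, :]             # (n, m, d)
    dist2 = np.sum((scales[None, :, :] * diff) ** 2, -1)   # (n, m)
    phi = np.exp(-dist2)                                    # Gaussian radial function
    return phi @ weights                                     # (n,)

def regularizer(weights, scales, lambda_w=1e-3, lambda_s=1e-3):
    """Weight decay plus a penalty that pushes scaling factors toward zero.

    Driving s_jd toward 0 removes dimension d from neuron j's weighted
    distance, which is the 'soft local dimension reduction' the abstract
    refers to. The penalty form here is a plausible stand-in only.
    """
    return lambda_w * np.sum(weights ** 2) + lambda_s * np.sum(np.abs(scales))
```

When a scaling factor s_jd is driven to zero, dimension d no longer affects neuron j's distance, so that input dimension is effectively removed for that neuron alone; the actual penalty used by Mahdi and Rouchka may differ from the stand-in above.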
Pages: 673-686
Page count: 14