Adaptive Learning for Robust Radial Basis Function Networks

Cited by: 31
Authors
Seghouane, Abd-Krim [1]
Shokouhi, Navid [1]
Affiliations
[1] Univ Melbourne, Dept Elect & Elect Engn, Melbourne, Vic 3010, Australia
Funding
Australian Research Council
Keywords
alpha-divergence; output linear parameters; radial basis function networks (RBFNs); robust estimation; NEURAL-NETWORKS; FUNCTION APPROXIMATION; FEEDFORWARD NETWORKS; HIDDEN UNITS; CLASSIFICATION; ALGORITHM; MACHINE;
DOI
10.1109/TCYB.2019.2951811
Chinese Library Classification
TP [automation technology; computer technology]
Subject Classification Code
0812
Abstract
This article addresses the robust estimation of the output layer linear parameters in a radial basis function network (RBFN). A prominent method used to estimate the output layer parameters in an RBFN with predetermined hidden layer parameters is least-squares estimation, which is the maximum-likelihood (ML) solution in the specific case of Gaussian noise. We highlight the connection between ML estimation and minimizing the Kullback-Leibler (KL) divergence between the actual noise distribution and the assumed Gaussian noise. Based on this connection, a method is proposed using a variant of a generalized KL divergence, which is known to be more robust to outliers in pattern recognition and machine-learning problems. The proposed approach produces a surrogate-likelihood function, which is robust in the sense that it is adaptive to a broader class of noise distributions. Several signal processing experiments are conducted using artificially generated and real-world data. It is shown that in all cases, the proposed adaptive learning algorithm outperforms the standard approaches in terms of mean-squared error (MSE). Using the relative increase in the MSE under different noise conditions, we compare the robustness of the proposed algorithm with existing methods for robust RBFN training and show that our method yields an overall improvement in terms of absolute MSE values and consistency.
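The baseline the abstract refers to — least-squares estimation of the output-layer weights given fixed hidden-layer parameters — can be sketched as follows. This is a minimal NumPy illustration with hypothetical toy data, centers, and kernel width, not the authors' implementation of the robust alpha-divergence method:

```python
import numpy as np

def rbf_design_matrix(x, centers, width):
    # Gaussian hidden-layer activations: phi[i, j] = exp(-(x_i - c_j)^2 / (2 * width^2))
    d2 = (x[:, None] - centers[None, :]) ** 2
    return np.exp(-d2 / (2.0 * width ** 2))

# Toy 1-D regression target with additive Gaussian noise (illustrative only)
rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 200)
y = np.sinc(x) + 0.05 * rng.standard_normal(x.size)

# Predetermined hidden-layer parameters: fixed centers and a common width
centers = np.linspace(-3.0, 3.0, 15)
phi = rbf_design_matrix(x, centers, width=0.5)

# Least-squares fit of the output weights: the ML solution under Gaussian noise
w, *_ = np.linalg.lstsq(phi, y, rcond=None)

y_hat = phi @ w
mse = np.mean((y - y_hat) ** 2)
```

Under Gaussian noise this fit is optimal; the paper's point is that when the actual noise departs from Gaussian (e.g., outliers), this least-squares step degrades, motivating the alpha-divergence-based surrogate likelihood.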
Pages: 2847-2856
Number of pages: 10
Related Papers
50 in total
[1] Akaike, H., 1973, Proc. 2nd Int. Symp. Information Theory, p. 267, DOI 10.1007/978-1-4612-1694-0_15
[2] Amari, Shun-Ichi. α-Divergence Is Unique, Belonging to Both f-Divergence and Bregman Divergence Classes. IEEE Transactions on Information Theory, 2009, 55(11): 4925-4931
[3] Chen, Hao; Gong, Yu; Hong, Xia; Chen, Sheng. A Fast Adaptive Tunable RBF Network for Nonstationary Systems. IEEE Transactions on Cybernetics, 2016, 46(12): 2683-2692
[4] Chen, Hao; Gong, Yu; Hong, Xia. Online Modeling With Tunable RBF Network. IEEE Transactions on Cybernetics, 2013, 43(3): 935-947
[5] Chen, S.; Billings, S. A.; Grant, P. M. Recursive Hybrid Algorithm for Nonlinear System Identification Using Radial Basis Function Networks. International Journal of Control, 1992, 55(5): 1051-1070
[6] Chen, S.; Cowan, C. F. N.; Grant, P. M. Orthogonal Least-Squares Learning Algorithm for Radial Basis Function Networks. IEEE Transactions on Neural Networks, 1991, 2(2): 302-309
[7] Chuang, C. C., 2004, Neurocomputing, 56: 123, DOI 10.1016/S0925-2312(03)00436-3
[8] Cichocki, Andrzej; Amari, Shun-ichi. Families of Alpha-, Beta-, and Gamma-Divergences: Flexible and Robust Measures of Similarities. Entropy, 2010, 12(6): 1532-1568
[9] Coates, A., 2011, JMLR Workshop and Conference Proceedings
[10] Cochocki, A., 1993, Neural Networks for Optimization and Signal Processing