Noisy time series prediction using M-estimator based robust radial basis function neural networks with growing and pruning techniques

Cited by: 40
Authors
Lee, Chien-Cheng [1 ,2 ]
Chiang, Yu-Chun [3 ]
Shih, Cheng-Yuan [1 ]
Tsai, Chun-Li [4 ]
Affiliations
[1] Yuan Ze Univ, Dept Commun Engn, Tao Yuan 320, Taiwan
[2] Yuan Ze Univ, Commun Res Ctr, Tao Yuan 320, Taiwan
[3] Yuan Ze Univ, Dept Mech Engn, Tao Yuan 320, Taiwan
[4] Natl Cheng Kung Univ, Dept Econ, Tainan 70101, Taiwan
Keywords
Time-series; Radial basis function (RBF); Least-mean square (LMS); M-estimator; Growing and pruning (GAP); LEARNING ALGORITHM;
DOI
10.1016/j.eswa.2008.06.017
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Noisy time series prediction is attractive and challenging since it is essential in many fields, such as forecasting, modeling, signal processing, and economic and business planning. The radial basis function (RBF) neural network is considered a good candidate for prediction problems because of its rapid learning capacity, and it has therefore been applied successfully to nonlinear time series modeling and forecasting. However, the traditional RBF network faces two primary problems. The first is that network performance is easily degraded by noise. The second concerns determining the number of hidden nodes. In this paper, we present an M-estimator based robust radial basis function (RBF) learning algorithm with growing and pruning techniques. The Welsch M-estimator and the median scale estimator are employed to suppress the influence of noise. The concept of neuron significance is adopted to implement the growing and pruning of network nodes. The proposed method not only eliminates the influence of noise but also dynamically adjusts the number of neurons toward an appropriate network size. Experimental results show that the proposed method produces a smaller prediction error than the compared methods. Furthermore, even with additive noise at 30% of the magnitude of the data, the proposed method still performs well. (C) 2008 Elsevier Ltd. All rights reserved.
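The robust weighting scheme described in the abstract — a Welsch M-estimator combined with a median-based (MAD) scale estimate — can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names are hypothetical, and the tuning constant c ≈ 2.9846 is a common choice for 95% Gaussian efficiency rather than a value taken from the paper.

```python
import math
from statistics import median

def mad_scale(residuals):
    # Median absolute deviation, rescaled so it consistently estimates the
    # standard deviation under Gaussian noise (0.6745 is the 75th percentile
    # of the standard normal distribution).
    med = median(residuals)
    return median(abs(r - med) for r in residuals) / 0.6745

def welsch_weight(residual, scale, c=2.9846):
    # Welsch weight w(u) = exp(-(u/c)^2): residuals near zero get weight
    # close to 1, while large residuals (outliers) are smoothly driven
    # toward zero and thus barely influence the learning update.
    u = residual / (c * scale + 1e-12)  # small guard against a zero scale
    return math.exp(-u * u)
```

In a robust RBF learning step, each training sample's error contribution would be multiplied by its Welsch weight, so samples corrupted by outlier noise contribute almost nothing to the parameter update.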
Pages: 4717-4724
Page count: 8
Related papers
22 total
[1] Anonymous (1997). NEURO FUZZY SOFT COM.
[2] Barnett (1994). OUTLIERS STAT DATA.
[3] Chen, J. H., Chen, C. S., & Chen, Y. S. (2003). Fast algorithm for robust template matching with M-estimators. IEEE Transactions on Signal Processing, 51(1), 230-243.
[4] Chen, S., Chng, E. S., & Alkadhimi, K. (1996). Regularized orthogonal least squares algorithm for constructing radial basis function networks. International Journal of Control, 64(5), 829-837.
[5] Chen, S. (2006). Local regularization assisted orthogonal least squares regression. Neurocomputing, 69(4-6), 559-585.
[6] Chen, S., Cowan, C. F. N., & Grant, P. M. (1991). Orthogonal least-squares learning algorithm for radial basis function networks. IEEE Transactions on Neural Networks, 2(2), 302-309.
[7] Chen, S., Wu, Y., & Luk, B. L. (1999). Combined genetic algorithm optimization and regularized orthogonal least squares learning for radial basis function networks. IEEE Transactions on Neural Networks, 10(5), 1239-1243.
[8] Chen, Y. H., Yang, B., & Dong, J. W. (2006). Time-series prediction using a local linear wavelet neural network. Neurocomputing, 69(4-6), 449-465.
[9] Chen, Y. (2004). International Journal of Neural Systems, 14, 125. DOI: 10.1142/S0129065704001905.
[10] Chng, E. S., Chen, S., & Mulgrew, B. (1996). Gradient radial basis function networks for nonlinear and nonstationary time series prediction. IEEE Transactions on Neural Networks, 7(1), 190-194.