A comprehensive evaluation of random vector functional link networks

Cited by: 335
Authors
Zhang, Le [1 ]
Suganthan, P. N. [1 ]
Affiliations
[1] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
Keywords
Random vector functional link networks; Ridge regression; Moore-Penrose pseudoinverse; Data classification; NEURAL-NETWORKS; CLASSIFICATION; APPROXIMATION; NET;
DOI
10.1016/j.ins.2015.09.025
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Discipline classification code
0812
Abstract
With randomly generated weights between the input and hidden layers, a random vector functional link (RVFL) network is a universal approximator for continuous functions on compact sets and has a fast learning property. Although it was proposed two decades ago, the classification ability of this family of networks has not been fully investigated. Through a comprehensive evaluation on 121 UCI datasets, this work investigates the effects of the bias in the output layer, the direct links from the input layer to the output layer, the type of activation function in the hidden layer, the scaling of the parameter randomization, and the solution procedure for the output weights. Surprisingly, we found that the direct links play an important performance-enhancing role in RVFL, while the bias term in the output neuron has no significant effect. The ridge-regression-based closed-form solution was better than the one based on the Moore-Penrose pseudoinverse. Instead of using a uniform randomization in [-1,+1] for all datasets, tuning the scaling of the uniform randomization range for each dataset enhances the overall performance. Six commonly used activation functions were investigated, and we found that the hardlim and sign activation functions degrade the overall performance. These basic conclusions can serve as general guidelines for designing RVFL-network-based classifiers. (C) 2015 Elsevier Inc. All rights reserved.
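To make the structure described in the abstract concrete, the following is a minimal sketch (not the authors' code) of an RVFL classifier in Python/NumPy: the input-to-hidden weights and hidden biases are drawn uniformly from [-scale, +scale] and never trained, the direct link concatenates the raw inputs with the hidden activations, and the output weights are obtained in closed form by ridge regression, beta = (D'D + lam*I)^(-1) D'Y. The function names, the sigmoid activation, and the default values of n_hidden, scale and lam are illustrative assumptions, not settings from the paper; no output bias is included, in line with the finding that it has no significant effect.

import numpy as np

def rvfl_train(X, Y, n_hidden=100, scale=1.0, lam=1e-3, seed=None):
    # X: (n_samples, n_features) inputs; Y: (n_samples, n_classes) one-hot targets.
    rng = np.random.default_rng(seed)
    # Random, untrained input-to-hidden weights and biases, uniform in [-scale, +scale].
    W = rng.uniform(-scale, scale, size=(X.shape[1], n_hidden))
    b = rng.uniform(-scale, scale, size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # sigmoid hidden activations
    D = np.hstack([X, H])                    # direct link: raw inputs alongside hidden features
    # Ridge-regression closed form for the output weights: beta = (D'D + lam*I)^(-1) D'Y.
    beta = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ Y)
    return W, b, beta

def rvfl_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return (np.hstack([X, H]) @ beta).argmax(axis=1)   # predicted class indices

Replacing the solve call with np.linalg.pinv(D) @ Y (and dropping lam) gives the Moore-Penrose pseudoinverse variant that the abstract reports as inferior to the ridge solution, and tuning scale per dataset corresponds to the randomization-range scaling discussed above.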
Pages: 1094-1105
Number of pages: 12