Generating random weights and biases in feedforward neural networks with random hidden nodes

Cited by: 38
Author
Dudek, Grzegorz [1]
Affiliation
[1] Czestochowa University of Technology, Faculty of Electrical Engineering, 17 Armii Krajowej Ave, PL-42200 Czestochowa, Poland
Keywords
Activation functions; Function approximation; Feedforward neural networks; Neural networks with random hidden nodes; Randomized learning algorithms
DOI
10.1016/j.ins.2018.12.063
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Neural networks with random hidden nodes have gained increasing interest among researchers and in practical applications. This is due to their unique features, such as very fast training and the universal approximation property. In these networks, the weights and biases of the hidden nodes, which determine the nonlinear feature mapping, are set randomly and are not learned. Appropriate selection of the intervals from which the weights and biases are drawn is extremely important, yet this topic has not been sufficiently explored in the literature. In this work, a method of generating random weights and biases is proposed. The method generates the hidden-node parameters in such a way that the nonlinear fragments of the activation functions are located in the input-space regions containing data and can therefore be used to construct a surface approximating the nonlinear target function. The weights and biases depend on the input data range and the activation function type. The proposed method also allows us to control the degree of generalization of the model. Together, these properties improve the approximation performance of the network. Several experiments show very promising results.
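The abstract describes the key idea only at a high level: hidden-node parameters should be generated so that the steep, nonlinear fragments of the activation functions fall inside the input regions occupied by the data. Below is a minimal sketch of that idea, not the paper's exact algorithm: each sigmoid's inflection point is anchored at a randomly chosen training input by setting the bias to b = -w . x*, the weight interval w_scale (which governs slope and thus smoothing) is a hyperparameter assumed here for illustration, and the output weights are fitted by least squares, as is standard in randomized learning. All function names and the toy target are illustrative.

```python
import numpy as np

def fit_random_hidden_net(X, y, n_hidden=60, w_scale=10.0, seed=None):
    """Single-hidden-layer net with random hidden nodes (illustrative sketch).

    Hidden weights are drawn from [-w_scale, w_scale]; each bias is set to
    b_k = -w_k . x*_k, where x*_k is a randomly chosen training input, so the
    sigmoid's steep (nonlinear) fragment sits inside the data region.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.uniform(-w_scale, w_scale, size=(d, n_hidden))  # random slopes
    anchors = X[rng.integers(0, n, size=n_hidden)]          # (n_hidden, d)
    b = -np.sum(anchors * W.T, axis=1)                      # b_k = -w_k . x*_k
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))                  # hidden-layer outputs
    beta = np.linalg.pinv(H) @ y                            # least-squares readout
    return W, b, beta

def predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy usage on a highly nonlinear 1-D target.
X = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y = np.sin(20.0 * X[:, 0]) * np.exp(-3.0 * X[:, 0])
W, b, beta = fit_random_hidden_net(X, y, seed=0)
rmse = np.sqrt(np.mean((predict(X, W, b, beta) - y) ** 2))
print(f"train RMSE: {rmse:.4f}")
```

With this scheme, a smaller w_scale produces flatter activation fragments and a smoother, more strongly generalizing fit, while larger values allow sharper local detail, mirroring the generalization control mentioned in the abstract.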
Pages: 33-56 (24 pages)
References (15 in total)
[1] [Anonymous], Information Sciences.
[2] Ferrari S., Stengel R.F. Smooth function approximation using neural networks. IEEE Transactions on Neural Networks, 2005, 16(1): 24-38.
[3] Hagan M.T., Menhaj M.B. Training feedforward networks with the Marquardt algorithm. IEEE Transactions on Neural Networks, 1994, 5(6): 989-993.
[4] Husmeier D. Random Vector Functional Link (RVFL) Networks. 1999.
[5] Igelnik B., Pao Y.H. Stochastic choice of basis functions in adaptive function approximation and the functional-link net. IEEE Transactions on Neural Networks, 1995, 6(6): 1320-1329.
[6] Kwok T.Y., Yeung D.Y. Objective functions for training new hidden units in constructive neural networks. IEEE Transactions on Neural Networks, 1997, 8(5): 1131-1148.
[7] Li M., Wang D. Insights into randomized algorithms for neural networks: Practical issues and common pitfalls. Information Sciences, 2017, 382: 170-178.
[8] Nguyen D. IJCNN International Joint Conference on Neural Networks, 1990, p. 21. DOI: 10.1109/IJCNN.1990.137819.
[9] Pao Y.H., Park G.H., Sobajic D.J. Learning and generalization characteristics of the random vector functional-link net. Neurocomputing, 1994, 6(2): 163-180.
[10] Pao Y.H., Takefuji Y. Functional-link net computing: Theory, system architecture, and functionalities. Computer, 1992, 25(5): 76-79.