A review on neural networks with random weights

Cited: 343
Authors
Cao, Weipeng [1 ]
Wang, Xizhao [1 ]
Ming, Zhong [1 ]
Gao, Jinzhu [2 ]
Affiliations
[1] Shenzhen Univ, Coll Comp Sci & Software Engn, Shenzhen 518060, Peoples R China
[2] Univ Pacific, Sch Engn & Comp Sci, Stockton, CA 95211 USA
Funding
National Natural Science Foundation of China
Keywords
Feed-forward neural networks; Training mechanism; Neural networks with random weights; EXTREME LEARNING-MACHINE; ADAPTIVE FUNCTION APPROXIMATION; FUNCTIONAL-LINK NETWORK; STOCHASTIC CHOICE; MIXED SELECTIVITY; ALGORITHMS; NEURONS; SCHEME; DIMENSIONALITY; REGRESSION;
DOI
10.1016/j.neucom.2017.08.040
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
In the era of big data, with ever-increasing computing power, artificial neural networks have shown great strength in solving data classification and regression problems. Traditional training of neural networks generally relies on the error back-propagation method to tune all parameters iteratively. As the number of hidden layers grows, this kind of training suffers from slow convergence, high time cost, and entrapment in local minima. To avoid these problems, neural networks with random weights (NNRW) were proposed, in which the weights between the input layer and the hidden layer are selected at random and the weights between the hidden layer and the output layer are obtained analytically. Researchers have shown that NNRW has much lower training complexity than traditional training of feed-forward neural networks. This paper objectively reviews the advantages and disadvantages of the NNRW model, tries to reveal the essence of NNRW, gives our comments and remarks on NNRW, and provides some useful guidelines for choosing a mechanism to train a feed-forward neural network. (c) 2017 Elsevier B.V. All rights reserved.
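The training mechanism the abstract describes (randomly drawn, fixed input-to-hidden weights plus a closed-form solve for the output weights) fits in a few lines of NumPy. The sketch below is a minimal illustration under stated assumptions, not the implementation of any specific model the paper reviews: the tanh activation, the uniform [-1, 1] initialization range, and the ridge term reg are illustrative choices, and the function names train_nnrw / predict_nnrw are hypothetical.

```python
import numpy as np

def train_nnrw(X, y, n_hidden=100, reg=1e-3, seed=0):
    """Fit a single-hidden-layer NNRW regressor.

    Input-to-hidden weights are drawn at random and never updated;
    hidden-to-output weights come from a closed-form, ridge-regularized
    least-squares solution, so no iterative back propagation is needed.
    """
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))  # random, fixed
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # random, fixed
    H = np.tanh(X @ W + b)                                   # hidden-layer output matrix
    # Analytical output weights: beta = (H^T H + reg*I)^{-1} H^T y
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def predict_nnrw(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: regress a noisy sine wave.
X = np.linspace(0.0, 2.0 * np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel() + 0.05 * np.random.default_rng(1).standard_normal(200)
W, b, beta = train_nnrw(X, y, n_hidden=50)
print("train MSE:", np.mean((predict_nnrw(X, W, b, beta) - y) ** 2))
```

Because learning reduces to one linear solve over the hidden-layer output matrix, training cost scales with the hidden-layer size rather than with back-propagation epochs, which is the lower training complexity the abstract refers to.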
Pages: 278-287
Page count: 10
Related Papers
111 in total
[91] Scardapane, Simone; Wang, Dianhui; Panella, Massimo; Uncini, Aurelio. Distributed learning for Random Vector Functional-Link networks [J]. INFORMATION SCIENCES, 2015, 301: 271-284.
[92] Schmidt, W. F. 11th IAPR International Conference on Pattern Recognition, Proceedings, Vol. II [C]. 1992: 1. DOI: 10.1109/ICPR.1992.201708.
[93] Sun, Kai; Zhang, Jiangshe; Zhang, Chunxia; Hu, Junying. Generalized extreme learning machine autoencoder and a new deep neural network [J]. NEUROCOMPUTING, 2017, 230: 374-381.
[94] Tang, Jiexiong; Deng, Chenwei; Huang, Guang-Bin. Extreme Learning Machine for Multilayer Perceptron [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2016, 27(4): 809-821.
[95] Tsang, Eric C. C.; Hu, Qinghua; Chen, Degang. Feature and instance reduction for PNN classifiers based on fuzzy rough sets [J]. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2016, 7(1): 1-11.
[96] Vincent, P. JOURNAL OF MACHINE LEARNING RESEARCH, 2010, 11: 3371.
[97] Wan, Wanggen; Zhou, Zhenghua; Zhao, Jianwei; Cao, Feilong. A novel face recognition method: Using random weight networks and quasi-singular value decomposition [J]. NEUROCOMPUTING, 2015, 151: 1180-1186.
[98] Wang, Debby A.; Wang, Ran; Yan, Hong. Fast prediction of protein-protein interaction sites based on Extreme Learning Machines [J]. NEUROCOMPUTING, 2014, 128: 258-266.
[99] Wang, Guoren; Zhao, Yi; Wang, Di. A protein secondary structure prediction framework based on the Extreme Learning Machine [J]. NEUROCOMPUTING, 2008, 72(1-3): 262-268.
[100] Xie, Jue; Padoa-Schioppa, Camillo. Neuronal remapping and circuit persistence in economic decisions [J]. NATURE NEUROSCIENCE, 2016, 19(6): 855+.