Stacked autoencoder based deep random vector functional link neural network for classification

Cited by: 88
Authors
Katuwal, Rakesh [1 ]
Suganthan, P. N. [1 ]
Affiliation
[1] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
Keywords
Random Vector Functional Link (RVFL); Deep RVFL; Multi-layer RVFL; Randomized neural network; KERNEL RIDGE-REGRESSION; CLASSIFIERS; ALGORITHMS; ENSEMBLE;
DOI
10.1016/j.asoc.2019.105854
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Extreme learning machine (ELM), which can be viewed as a variant of the Random Vector Functional Link (RVFL) network without the direct input-output connections, has been extensively used to create multi-layer (deep) neural networks. Such networks employ randomization-based autoencoders (AE) for unsupervised feature extraction, followed by an ELM classifier for final decision making. Each randomization-based AE acts as an independent feature extractor, and a deep network is obtained by stacking several such AEs. Inspired by the better performance of RVFL over ELM, in this paper we propose several deep RVFL variants built on the stacked-autoencoder framework. Specifically, we introduce direct connections (feature reuse) from preceding layers to subsequent layers of the network, as in the original RVFL network. Such connections help to regularize the randomization and also reduce the model complexity. Furthermore, we introduce a denoising criterion, recovering clean inputs from their corrupted versions, into the autoencoders to achieve better higher-level representations than ordinary autoencoders. Extensive experiments on several classification datasets show that our proposed deep networks achieve overall better and faster generalization than other relevant state-of-the-art deep neural networks. (C) 2019 Elsevier B.V. All rights reserved.
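The abstract's core building block, an RVFL network, fixes random hidden weights and solves only for the output weights in closed form, with the input features concatenated to the hidden features via direct links. A minimal numpy sketch of that idea is below; function names, hyperparameters, and the two-class toy data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rvfl_train(X, Y, n_hidden=100, reg=1e-3):
    """Train a single-layer RVFL: random fixed hidden weights,
    output weights solved by ridge regression."""
    d = X.shape[1]
    W = rng.standard_normal((d, n_hidden))  # random hidden weights (never trained)
    b = rng.standard_normal(n_hidden)       # random hidden biases
    H = np.tanh(X @ W + b)                  # randomized hidden features
    D = np.hstack([H, X])                   # direct input-output links (feature reuse)
    # Closed-form ridge solution for the output weights beta
    beta = np.linalg.solve(D.T @ D + reg * np.eye(D.shape[1]), D.T @ Y)
    return W, b, beta

def rvfl_predict(X, W, b, beta):
    D = np.hstack([np.tanh(X @ W + b), X])
    return D @ beta

# Toy two-class demo with one-hot targets (illustrative data, not from the paper)
X = rng.standard_normal((200, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
Y = np.eye(2)[y]
W, b, beta = rvfl_train(X, Y)
pred = rvfl_predict(X, W, b, beta).argmax(axis=1)
acc = (pred == y).mean()
```

The direct links mean the ridge solve sees the raw inputs alongside the random features, which is the "feature reuse" the paper extends layer-by-layer in its deep variants; dropping the `X` columns from `D` would recover the ELM setup described in the abstract.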
Pages: 13
Related papers (60 in total)
[51] Vukovic, N.; Petrovic, M.; Miljkovic, Z. A comprehensive experimental evaluation of orthogonal polynomial expanded random vector functional link neural networks for regression. Applied Soft Computing, 2018, 70:1083-1096.
[52] White, H. HBK ECON, 2006, 24:459. DOI 10.1016/S1574-0706(05)01009-8.
[53] Widrow, B.; Greenblatt, A.; Kim, Y.; Park, D. The No-Prop algorithm: A new learning algorithm for multilayer neural networks. Neural Networks, 2013, 37:180-186.
[54] Wong, C. M.; Vong, C. M.; Wong, P. K.; Cao, J. Kernel-Based Multilayer Extreme Learning Machines for Representation Learning. IEEE Transactions on Neural Networks and Learning Systems, 2018, 29(3):757-762.
[55] Zhang, L.; Suganthan, P. N. Benchmarking Ensemble Classifiers with Novel Co-Trained Kernel Ridge Regression and Random Vector Functional Link Ensembles. IEEE Computational Intelligence Magazine, 2017, 12(4):61-72.
[56] Zhang, L.; Suganthan, P. N. Visual Tracking With Convolutional Random Vector Functional Link Network. IEEE Transactions on Cybernetics, 2017, 47(10):3243-3253.
[57] Zhang, L.; Suganthan, P. N. A comprehensive evaluation of random vector functional link networks. Information Sciences, 2016, 367:1094-1105.
[58] Zhang, L.; Suganthan, P. N. A survey of randomized algorithms for training neural networks. Information Sciences, 2016, 364:146-155.
[59] Zhang, Y.; Wu, J.; Cai, Z.; Du, B.; Yu, P. S. An unsupervised parameter learning model for RVFL neural network. Neural Networks, 2019, 112:85-97.
[60] Zhang, Y. C. Journal of Machine Learning Research, 2015, 16:3299.