Non-iterative and Fast Deep Learning: Multilayer Extreme Learning Machines

Cited by: 153
Authors
Zhang, Jie [1 ]
Li, Yanjiao [2 ]
Xiao, Wendong [3 ]
Zhang, Zhiqiang [4 ]
Affiliations
[1] Peking Univ, Sch Elect Engn & Comp Sci, Beijing 100871, Peoples R China
[2] Beijing Inst Technol, Sch Informat & Elect, Beijing 100081, Peoples R China
[3] Univ Sci & Technol Beijing, Sch Automat & Elect Engn, Beijing 100083, Peoples R China
[4] Univ Leeds, Sch Elect & Elect Engn, Leeds LS2 9JT, W Yorkshire, England
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
NEURAL-NETWORKS; MIXED SELECTIVITY; RECOGNITION; CLASSIFICATION; ALGORITHM; REPRESENTATIONS; APPROXIMATION; ARCHITECTURE; PREDICTION; FEATURES;
DOI
10.1016/j.jfranklin.2020.04.033
Chinese Library Classification (CLC)
TP [automation technology, computer technology];
Discipline Classification Code
0812;
Abstract
In the past decade, deep learning techniques have powered many aspects of our daily life and drawn ever-increasing research interest. However, conventional deep learning approaches, such as the deep belief network (DBN), restricted Boltzmann machine (RBM), and convolutional neural network (CNN), suffer from a time-consuming training process due to the fine-tuning of a large number of parameters and their complicated hierarchical structure. Furthermore, this complexity makes it difficult to theoretically analyze and prove the universal approximation capability of these conventional deep learning approaches. To tackle these issues, multilayer extreme learning machines (ML-ELM) were proposed, which have accelerated the development of deep learning. Compared with conventional deep learning, ML-ELMs are non-iterative and fast owing to their random feature mapping mechanism. In this paper, we perform a thorough review of the development of ML-ELMs, including the stacked ELM autoencoder (ELM-AE), residual ELM, and local receptive field based ELM (ELM-LRF), and address their applications. In addition, we discuss the connection between random neural networks and conventional deep learning. (C) 2020 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
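The random feature mapping mechanism the abstract refers to can be illustrated with a minimal single-hidden-layer ELM sketch (an illustrative reconstruction, not code from the paper; the function names, hidden-layer size, and ridge regularization term are assumptions here): the input weights and biases are drawn at random and then frozen, so training reduces to a single closed-form least-squares solve for the output weights, with no iterative fine-tuning.

```python
import numpy as np

# Illustrative sketch of a basic (single-hidden-layer) ELM; hyperparameters
# (n_hidden, reg) and function names are assumptions, not from the paper.
rng = np.random.default_rng(0)

def elm_fit(X, T, n_hidden=64, reg=1e-3):
    """Train an ELM on inputs X and targets T; return (W, b, beta)."""
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random input weights (fixed)
    b = rng.standard_normal(n_hidden)                # random biases (fixed)
    H = np.tanh(X @ W + b)                           # random feature mapping
    # Closed-form ridge-regularized least squares for the output weights:
    # beta = (H^T H + reg * I)^{-1} H^T T  -- the only "training" step.
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ T)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression: approximate y = sin(x) on [-3, 3].
X = np.linspace(-3, 3, 200).reshape(-1, 1)
T = np.sin(X)
W, b, beta = elm_fit(X, T)
err = np.max(np.abs(elm_predict(X, W, b, beta) - T))
```

An ML-ELM stacks ELM autoencoders trained the same way, layer by layer, which is why the whole network can be built without backpropagation.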
Pages: 8925-8955
Number of pages: 31
Cited References
141 in total
[111] Wang Y, Xie Z, Xu K, Dou Y, Lei Y. An efficient and effective convolutional auto-encoder extreme learning machine network for 3d feature learning. NEUROCOMPUTING, 2016, 174: 988-998.
[112] Wen X, Liu H, Yan G, Sun F. Weakly paired multimodal fusion using multilayer extreme learning machine. SOFT COMPUTING, 2018, 22(11): 3533-3544.
[113] Wong CM, Vong CM, Wong PK, Cao J. Kernel-Based Multilayer Extreme Learning Machines for Representation Learning. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29(03): 757-762.
[114] Wong PK, Gao XH, Wong KI, Vong CM. Online extreme learning machine based modeling and optimization for point-by-point engine calibration. NEUROCOMPUTING, 2018, 277: 187-197.
[115] Wu W, Wu QMJ, Sun W, Yang Y, Yuan X, Zheng WL, Lu BL. A Regression Method With Subnetwork Neurons for Vigilance Estimation Using EOG and EEG. IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2021, 13(01): 209-222.
[116] Xiao W, Zhang J, Li Y, Zhang S, Yang W. Class-specific cost regulation extreme learning machine for imbalanced classification. NEUROCOMPUTING, 2017, 261: 70-82.
[117] Xue QK. PROC IEEE MICR ELECT, 2016: 1. DOI 10.1109/MEMSYS.2016.7421541
[118] Xue S, Abdel-Hamid O, Jiang H, Dai L, Liu Q. Fast Adaptation of Deep Neural Network Based on Discriminant Codes for Speech Recognition. IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2014, 22(12): 1713-1725.
[119] Yang Y. IEEE T CYBERN, 2018, 45: 1463.
[120] Yang Y, Zhang H, Yuan D, Sun D, Li G, Ranjan R, Sun M. Hierarchical extreme learning machine based image denoising network for visual Internet of Things. APPLIED SOFT COMPUTING, 2019, 74: 747-759.