Extreme learning machine with local connections

Cited by: 5
Authors
Li, Feng [1 ]
Yang, Jie [2 ]
Yao, Mingchen [3 ]
Yang, Sibo [2 ]
Wu, Wei [2 ]
Affiliations
[1] Dalian Maritime Univ, Sch Sci, Dalian 116026, Peoples R China
[2] Dalian Univ Technol, Sch Math Sci, Dalian 116024, Peoples R China
[3] Heilongjiang Univ, Sch Math Sci, Harbin 150080, Heilongjiang, Peoples R China
Funding
US National Science Foundation;
Keywords
Extreme learning machine; Local connections; Sparsification of input-hidden weights; High dimensional input data; SMOOTHING L-1/2 REGULARIZATION; FEEDFORWARD NEURAL-NETWORKS; RECEPTIVE-FIELDS; GRADIENT-METHOD; CONVERGENCE;
DOI
10.1016/j.neucom.2019.08.069
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper is concerned with the sparsification of the input-hidden weights of the extreme learning machine (ELM). For ordinary feedforward neural networks, sparsification is usually achieved by introducing a regularization technique into the learning process. However, this strategy cannot be applied to ELM, since the input-hidden weights of ELM are randomly chosen rather than iteratively learned. We therefore propose a modified ELM, called ELM-LC (ELM with local connections), which sparsifies the input-hidden weights as follows: the hidden nodes and the input nodes are divided into corresponding groups, and each input node group is fully connected with its corresponding hidden node group but not with any other hidden node group. As in the usual ELM, the input-hidden weights are randomly generated, and the hidden-output weights are obtained through a least-squares computation. In numerical simulations on benchmark problems, the new ELM-LC performs better than the traditional ELM and an ELM with ordinary sparse input-hidden weights. (C) 2019 Published by Elsevier B.V.
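To make the grouping scheme concrete, here is a minimal NumPy sketch of the training procedure the abstract describes: a block-diagonal random input-hidden weight matrix (each input group connects only to its own hidden group) followed by a least-squares solve for the output weights. This is an illustration under stated assumptions, not the authors' reference implementation; the function names (elm_lc_train, elm_lc_predict), the sigmoid activation, the uniform weight range, and the requirement that the feature count divide evenly into the groups are all choices made for this example.

```python
import numpy as np

def elm_lc_train(X, T, n_groups, hidden_per_group, rng=None):
    """Sketch of ELM-LC training (hypothetical helper, not from the paper).

    X: (n_samples, n_features) inputs; T: (n_samples, n_outputs) targets.
    Assumes n_features divides evenly into n_groups (a simplification).
    """
    rng = np.random.default_rng(rng)
    n_samples, n_features = X.shape
    assert n_features % n_groups == 0, "this sketch requires equal-size input groups"
    in_per_group = n_features // n_groups
    n_hidden = n_groups * hidden_per_group

    # Block-diagonal input-hidden weights: each input group is fully
    # connected to its own hidden group and to no other hidden group.
    W = np.zeros((n_features, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    for g in range(n_groups):
        rows = slice(g * in_per_group, (g + 1) * in_per_group)
        cols = slice(g * hidden_per_group, (g + 1) * hidden_per_group)
        W[rows, cols] = rng.uniform(-1.0, 1.0, size=(in_per_group, hidden_per_group))

    # Hidden-layer output matrix (sigmoid activation, a common ELM choice).
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))

    # Hidden-output weights via the Moore-Penrose pseudoinverse: beta = H^+ T,
    # i.e., the least-squares solution of H @ beta = T.
    beta = np.linalg.pinv(H) @ T
    return W, b, beta

def elm_lc_predict(X, W, b, beta):
    """Forward pass through the trained ELM-LC sketch."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

With n_groups=1 the weight matrix becomes fully dense and the sketch reduces to a standard ELM, which makes the local-connection structure easy to compare against; on a task with, say, 12 input features, n_groups=4 splits the inputs into groups of three, each driving its own block of hidden nodes.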
Pages: 146-152
Number of pages: 7