Sample selection-based hierarchical extreme learning machine

Cited by: 4
Authors
Xu, Xinzheng [1 ,2 ,3 ]
Li, Shan [1 ,2 ]
Liang, Tianming [1 ]
Sun, Tongfeng [1 ,3 ]
Affiliations
[1] China Univ Min & Technol, Sch Comp Sci & Technol, Xuzhou 221116, Jiangsu, Peoples R China
[2] Minist Educ, Engn Res Ctr Min Digital, Xuzhou 221116, Jiangsu, Peoples R China
[3] Lanzhou Jiaotong Univ, Key Lab Optotechnol & Intelligent Control, Minist Educ, Lanzhou 730070, Gansu, Peoples R China
Keywords
Sample selection; Fuzzy C-means clustering; Condensed nearest neighbour; Hierarchical extreme learning machine; ELM; ALGORITHM; DEEP;
DOI
10.1016/j.neucom.2019.10.013
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Large amounts of training data can keep machine learning accuracy high to a certain extent, but the time cost is considerable because of the sheer volume of data and its dimensionality. Therefore, how to simultaneously select the most useful training samples and extract their main features, especially for image data, is an essential problem that urgently needs to be solved in large-scale machine learning. Herein, a training sample selection method based on the fuzzy c-means (FCM) clustering algorithm is proposed for this problem. It first utilises the condensed nearest neighbour (CNN) rule to make a preliminary selection of training samples, then utilises FCM to obtain the cluster centres of the selected data, and finally condenses the sample set according to a compression parameter. Meanwhile, considering the critical influence of the sample features on the classification model, this paper adopts the hierarchical extreme learning machine (H-ELM) to better solve the classification task. On this basis, the paper presents the FCM-CNN-H-ELM framework for data classification, which combines FCM-based CNN sample selection with H-ELM. Experimental results show that the proposed training sample selection method and classification framework achieve comparable or even higher prediction accuracy with a small number of training samples while significantly reducing the training time. (C) 2019 Elsevier B.V. All rights reserved.
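A minimal sketch of the sample selection pipeline described in the abstract, assuming a plain NumPy fuzzy c-means and a per-class keep_ratio playing the role of the compression parameter; the function names (fcm_centres, fcm_condense) and parameters are illustrative assumptions, not the authors' code, the preliminary CNN step is only indicated in a comment (e.g. via imbalanced-learn's CondensedNearestNeighbour), and the downstream H-ELM classifier is not shown.

```python
import numpy as np

def fcm_centres(X, n_clusters, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Plain fuzzy c-means on X (n_samples, n_features); returns cluster centres."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], n_clusters))      # fuzzy membership matrix
    U /= U.sum(axis=1, keepdims=True)             # each row sums to 1
    for _ in range(max_iter):
        Um = U ** m
        centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-10
        U_new = dist ** (-2.0 / (m - 1.0))        # standard FCM membership update
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.linalg.norm(U_new - U) < tol:
            return centres
        U = U_new
    return centres

def fcm_condense(X, y, n_clusters=10, keep_ratio=0.3):
    """Per class: cluster with FCM, then keep the keep_ratio fraction of samples
    closest to any centre (keep_ratio stands in for the compression parameter)."""
    keep = []
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        centres = fcm_centres(X[idx], min(n_clusters, len(idx)))
        d = np.linalg.norm(X[idx][:, None, :] - centres[None, :, :], axis=2).min(axis=1)
        keep.append(idx[np.argsort(d)[:max(1, int(keep_ratio * len(idx)))]])
    return np.concatenate(keep)

if __name__ == "__main__":
    # Toy two-class data; in the paper a condensed-nearest-neighbour pass would
    # precede this step, e.g. (assumed API from the imbalanced-learn package):
    #   X, y = CondensedNearestNeighbour().fit_resample(X, y)
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(4, 1, (200, 2))])
    y = np.array([0] * 200 + [1] * 200)
    sel = fcm_condense(X, y, n_clusters=5, keep_ratio=0.25)
    print(f"kept {len(sel)} of {len(X)} samples")  # reduced set would feed the H-ELM
```

The sketch keeps the samples nearest to the fuzzy cluster centres, which mirrors the abstract's idea of condensing each class around representative prototypes before training the classifier.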
Pages: 95-102
Number of pages: 8