Ensemble of Extreme Learning Machines with trained classifier combination and statistical features for hyperspectral data

Cited by: 16
Authors
Ksieniewicz, Pawel [1 ]
Krawczyk, Bartosz [2 ]
Wozniak, Michal [1 ]
Affiliations
[1] Wroclaw Univ Technol, Dept Syst & Comp Networks, Wybrzeze Wyspianskiego 27, PL-50370 Wroclaw, Poland
[2] Virginia Commonwealth Univ, Dept Comp Sci, Med Coll Virginia Campus, Richmond, VA 23284 USA
Keywords
Ensemble learning; Extreme Learning Machines; Hyperspectral imaging; Computer vision; Feature extraction; Dimensionality reduction; Image classification; STRATEGY; DISPLAY; SYSTEMS;
DOI
10.1016/j.neucom.2016.04.076
Chinese Library Classification
TP18 [Artificial intelligence theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Remote sensing and hyperspectral data analysis are areas offering a wide range of valuable practical applications. However, they generate massive and complex data that is very difficult for a human being to analyze. Therefore, methods for efficient data representation and data mining are of high interest to these fields. In this paper, we introduce a novel pipeline for feature extraction and classification of hyperspectral images. To obtain a compressed representation, we propose to extract a set of statistics-based properties from these images. This allows for embedding the feature space into fourteen channels, yielding a significant dimensionality reduction. These features are used as input for ensemble learning based on randomized neural networks. We introduce a novel method for forming ensembles of Extreme Learning Machines based on randomized feature subspaces and a trained combiner. It operates on continuous outputs and uses a perceptron-based learning scheme to calculate weights assigned to each classifier and each class independently. Extensive experiments carried out on a number of benchmark images prove that using the proposed feature extraction and extreme learning ensemble leads to a significant gain in classification accuracy. (C) 2017 Elsevier B.V. All rights reserved.
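The combiner described in the abstract — perceptron-trained weights assigned to each classifier and class independently, operating on continuous ELM outputs over random feature subspaces — can be sketched roughly as below. This is a minimal illustration under stated assumptions, not the authors' implementation: the hidden-layer size, subspace fraction, update rule details, and the synthetic two-class data (standing in for the paper's fourteen statistical channels) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class ELM:
    """Single-hidden-layer Extreme Learning Machine: random input weights,
    output weights fitted in closed form via the pseudo-inverse."""
    def __init__(self, n_hidden, rng):
        self.n_hidden, self.rng = n_hidden, rng

    def _hidden(self, X):
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))  # sigmoid activations

    def fit(self, X, y, n_classes):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        T = np.eye(n_classes)[y]                 # one-hot targets
        self.beta = np.linalg.pinv(self._hidden(X)) @ T
        return self

    def predict_proba(self, X):
        return self._hidden(X) @ self.beta       # continuous (unnormalised) outputs

def fit_ensemble(X, y, n_classes, n_members=5, subspace=0.6):
    """Train ELM members on random feature subspaces (indices kept per member)."""
    members = []
    for _ in range(n_members):
        idx = rng.choice(X.shape[1], int(subspace * X.shape[1]), replace=False)
        members.append((idx, ELM(50, rng).fit(X[:, idx], y, n_classes)))
    return members

def member_outputs(members, X):
    return np.stack([m.predict_proba(X[:, idx]) for idx, m in members])  # (M, N, C)

def train_combiner(members, X, y, n_classes, epochs=20, lr=0.1):
    """Perceptron-style learning of one weight per (classifier, class) pair."""
    w = np.ones((len(members), n_classes))
    P = member_outputs(members, X)
    for _ in range(epochs):
        pred = np.einsum('mc,mnc->nc', w, P).argmax(axis=1)
        for i in np.where(pred != y)[0]:         # on error: reward the true class,
            w[:, y[i]] += lr * P[:, i, y[i]]     # penalise the wrongly chosen one
            w[:, pred[i]] -= lr * P[:, i, pred[i]]
    return np.clip(w, 0.0, None)

def predict(members, w, X):
    return np.einsum('mc,mnc->nc', w, member_outputs(members, X)).argmax(axis=1)

# Demo on synthetic two-class data with 14 features, mirroring the paper's
# fourteen statistical channels (data and separability are assumptions).
X = np.vstack([rng.normal(0, 1, (200, 14)), rng.normal(2, 1, (200, 14))])
y = np.repeat([0, 1], 200)
members = fit_ensemble(X, y, n_classes=2)
w = train_combiner(members, X, y, n_classes=2)
accuracy = (predict(members, w, X) == y).mean()
```

The key design point the abstract highlights is that the combiner weights are indexed by both classifier and class, so a member that is reliable only for some classes can still contribute where it is strong.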
Pages: 28-37
Number of pages: 10
References
43 entries
[1] Agarwal, Abhishek. 2007 IEEE International Symposium on Signal Processing and Information Technology, 2007: 353. DOI 10.1109/ISSPIT.2007.4458191.
[2] Alpaydin, E. Combined 5 x 2 cv F test for comparing supervised classification learning algorithms. Neural Computation, 1999, 11(8): 1885-1892.
[3] [Anonymous]. Combining Pattern Cl[...], 2004. DOI 10.1002/0471660264.
[4] [Anonymous]. Machine Vision: Theory, Algorithms, Practicalities, 2004.
[5] Ayerdi, Borja; Grana, Manuel. Hyperspectral image nonlinear unmixing and reconstruction by ELM regression ensemble. Neurocomputing, 2016, 174: 299-309.
[6] Ayerdi, Borja; Grana, Manuel. Hybrid extreme rotation forest. Neural Networks, 2014, 52: 33-42.
[7] Bazi, Yakoub; Melgani, Farid. Toward an optimal SVM classification system for hyperspectral remote sensing images. IEEE Transactions on Geoscience and Remote Sensing, 2006, 44(11): 3374-3385.
[8] Bennett, K. P. Advances in Neural Information Processing Systems, 1999, 11: 368.
[9] Buteneers, Pieter; Caluwaerts, Ken; Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin. Optimized parameter search for large datasets of the regularization parameter and feature selection for ridge regression. Neural Processing Letters, 2013, 38(3): 403-416.
[10] Cao, Jingjing; Kwong, Sam; Wang, Ran; Li, Xiaodong; Kong, Xiangfei. Class-specific soft voting based multiple extreme learning machines ensemble. Neurocomputing, 2015, 149: 275-284.