A Heterogeneous Ensemble of Extreme Learning Machines with Correntropy and Negative Correlation

Times Cited: 0
Authors
Abuassba, Adnan O. M. [1 ,2 ]
Zhang, Yao [3 ]
Luo, Xiong [1 ,2 ]
Zhang, Dezheng [1 ,2 ]
Aziguli, Wulamu [1 ,2 ]
Affiliations
[1] USTB, Sch Comp & Commun Engn, Beijing 100083, Peoples R China
[2] Beijing Key Lab Knowledge Engn Mat Sci, Beijing 100083, Peoples R China
[3] NYU, Tandon Sch Engn, Brooklyn, NY 11201 USA
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Extreme Learning Machine (ELM); ensemble; classification; correntropy; negative correlation; WIRELESS SENSOR NETWORKS; FUSION SCHEME; CLASSIFICATION;
DOI
10.23919/TST.2017.8195351
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
The Extreme Learning Machine (ELM) is an effective learning algorithm for a Single-Layer Feedforward Network (SLFN). It performs well on many practical problems owing to its fast learning speed. In practice, however, its performance can be degraded by noise in the training data. To address this issue, we propose a novel heterogeneous ensemble of ELMs. Specifically, correntropy is used to make the ensemble insensitive to outliers, while Negative Correlation Learning (NCL) is applied to enhance diversity among the ensemble members. The proposed Heterogeneous Ensemble of ELMs (HE2LM) for classification combines different ELM algorithms, including the Regularized ELM (RELM), the Kernel ELM (KELM), and the L2-norm-optimized ELM (ELML2). The ensemble is constructed by training a randomly selected ELM classifier on a subset of the training data obtained through random resampling, and the class label of unseen data is then predicted with a maximum weighted sum approach. After splitting the training data into subsets, the proposed HE2LM is evaluated on classification and regression tasks over real-world benchmark datasets and synthetic datasets. The simulation results show that, compared with other algorithms, the proposed method achieves higher prediction accuracy, better generalization, and lower sensitivity to outliers.
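The abstract describes the ensemble construction at a high level: heterogeneous ELM base learners, each trained on a randomly resampled subset of the training data, combined through a maximum weighted sum of class scores. The following is a minimal Python sketch of that skeleton, assuming a plain ridge-regularized ELM as the only base learner and a training-subset-accuracy weighting; the class and function names, hyperparameters, and weighting rule are illustrative assumptions, not the authors' exact HE2LM implementation (which additionally mixes RELM/KELM/ELML2 and uses correntropy and NCL).

```python
# Minimal sketch of the ensemble skeleton described in the abstract.
# Assumption: a plain ridge-regularized ELM stands in for all base learners;
# HE2LM itself mixes RELM/KELM/ELML2 and adds correntropy and NCL terms.
import numpy as np


class BasicRegularizedELM:
    """SLFN trained ELM-style: random hidden layer, ridge-solved output weights."""

    def __init__(self, n_hidden=50, C=1.0, rng=None):
        self.n_hidden = n_hidden
        self.C = C  # regularization strength (illustrative default)
        self.rng = np.random.default_rng() if rng is None else rng

    def _hidden(self, X):
        # Sigmoid activations of the random hidden layer.
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, Y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # Closed-form ridge solution: beta = (H'H + I/C)^(-1) H'Y
        self.beta = np.linalg.solve(H.T @ H + np.eye(self.n_hidden) / self.C, H.T @ Y)
        return self

    def scores(self, X):
        return self._hidden(X) @ self.beta


def train_he2lm_like(X, y, n_classes, n_learners=10, subset_ratio=0.7, seed=0):
    """Train each base learner on a random resample, choosing its configuration
    at random to mimic the heterogeneity of the ensemble."""
    rng = np.random.default_rng(seed)
    Y = np.eye(n_classes)[y]  # one-hot targets
    configs = [dict(n_hidden=h, C=c) for h in (30, 60, 120) for c in (0.1, 1.0, 10.0)]
    ensemble, n = [], X.shape[0]
    for _ in range(n_learners):
        idx = rng.choice(n, size=int(subset_ratio * n), replace=True)  # random resampling
        cfg = configs[rng.integers(len(configs))]                      # random ELM variant
        model = BasicRegularizedELM(rng=rng, **cfg).fit(X[idx], Y[idx])
        # Illustrative weight: accuracy on the learner's own training subset.
        weight = np.mean(np.argmax(model.scores(X[idx]), axis=1) == y[idx])
        ensemble.append((model, weight))
    return ensemble


def predict(ensemble, X):
    """Maximum weighted sum: accumulate weighted class scores, return the argmax."""
    total = sum(w * m.scores(X) for m, w in ensemble)
    return np.argmax(total, axis=1)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 4))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)  # toy two-class problem
    models = train_he2lm_like(X, y, n_classes=2)
    print("training accuracy:", np.mean(predict(models, X) == y))
```

The sketch only reproduces the resampling and weighted-sum voting structure; in the paper's scheme the candidate pool would include KELM and ELML2 learners, and the correntropy objective and NCL penalty would replace the plain ridge solution.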
Pages: 691-701
Number of pages: 11