A heterogeneous ensemble of extreme learning machines with correntropy and negative correlation

Cited by: 0
Authors
Abuassba A.O.M. [1 ,2 ]
Zhang Y. [3 ]
Luo X. [1 ,2 ]
Zhang D. [1 ,2 ]
Aziguli W. [1 ,2 ]
Affiliations
[1] School of Computer and Communication Engineering, University of Science and Technology Beijing (USTB), Beijing
[2] Beijing Key Laboratory of Knowledge Engineering for Materials Science, Beijing
[3] Tandon School of Engineering, New York University, Brooklyn, NY 11201
Source
Luo, Xiong (xluo@ustb.edu.cn) | 2017 / Tsinghua University / Vol. 22
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China;
Keywords
classification; correntropy; ensemble; Extreme Learning Machine (ELM); negative correlation;
DOI
10.23919/TST.2017.8195351
Abstract
The Extreme Learning Machine (ELM) is an effective learning algorithm for a Single-Layer Feedforward Network (SLFN). Owing to its fast learning speed, it performs well on many practical problems. In practical applications, however, its performance can be degraded by noise in the training data. To tackle this noise issue, we propose a novel heterogeneous ensemble of ELMs in this article. Specifically, correntropy is used to make the model insensitive to outliers, while Negative Correlation Learning (NCL) is implemented to enhance diversity among ensemble members. The proposed Heterogeneous Ensemble of ELMs (HE2LM) for classification combines different ELM variants, including the Regularized ELM (RELM), the Kernel ELM (KELM), and the L2-norm-optimized ELM (ELML2). The ensemble is constructed by training a randomly selected ELM classifier on a subset of the training data obtained through random resampling; the class label of unseen data is then predicted using a maximum weighted sum approach. The proposed HE2LM is evaluated on classification and regression tasks over real-world benchmark datasets and synthetic datasets. The simulation results show that, compared with other algorithms, our proposed method achieves higher prediction accuracy, better generalization, and less sensitivity to outliers. © 1996-2012 Tsinghua University Press.
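The ensemble scheme described above can be illustrated with a minimal sketch. This is not the authors' implementation: for brevity it uses only a ridge-regularized ELM (RELM-style) base learner rather than the full RELM/KELM/ELML2 mix, obtains diversity through bootstrap resampling and per-member regularization strength instead of correntropy and NCL, and weights members by training accuracy as a stand-in for the paper's weighted-sum rule. All function names and parameter values are illustrative assumptions.

```python
import numpy as np

def train_elm(X, y_onehot, n_hidden, C, rng):
    # ELM: hidden-layer weights and biases are drawn at random and never trained.
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)
    # RELM-style output weights: regularized least-squares (ridge) solution.
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ y_onehot)
    return W, b, beta

def predict_elm(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta  # per-class scores

def train_ensemble(X, y, n_classes, n_members=5, n_hidden=40, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    Y = np.eye(n_classes)[y]  # one-hot targets
    members = []
    for _ in range(n_members):
        idx = rng.integers(0, len(X), size=len(X))  # bootstrap resample of the training set
        C = rng.choice([0.1, 1.0, 10.0])            # vary regularization to diversify members
        model = train_elm(X[idx], Y[idx], n_hidden, C, rng)
        # Weight each member by its training accuracy (illustrative weighting choice).
        weight = (predict_elm(model, X).argmax(axis=1) == y).mean()
        members.append((model, weight))
    return members

def predict_ensemble(members, X):
    # Maximum weighted sum: accumulate weighted class scores, pick the largest.
    scores = sum(w * predict_elm(m, X) for m, w in members)
    return scores.argmax(axis=1)
```

The key design point the sketch preserves is that only the output weights `beta` are fitted (a single linear solve), which is what gives ELMs their fast training speed.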
Pages: 691-701
Page count: 10