Class-specific cost-sensitive boosting weighted ELM for class imbalance learning

Cited by: 0
Authors
Bhagat Singh Raghuwanshi
Sanyam Shukla
Affiliations
[1] Maulana Azad National Institute of Technology, Department of Computer Science and Engineering
Source
Memetic Computing | 2019, Vol. 11
Keywords
Extreme learning machine; Classification; Class imbalance problem; Cost-sensitive boosting; AdaBoost;
DOI
Not available
Abstract
The class imbalance problem arises when the training dataset contains significantly fewer instances of one class than of the other. Traditional classification algorithms such as the extreme learning machine (ELM) and the support vector machine (SVM) are biased towards the majority class: because they minimize the unweighted least squares error, minority-class instances are usually misclassified. Algorithms such as weighted ELM (WELM) and weighted SVM instead minimize a weighted least squares error and thus address the class imbalance problem more effectively. Variants of WELM such as boosting WELM and ensemble WELM further improve its performance by employing ensemble methods. This work proposes class-specific AdaC1, class-specific AdaC2 and class-specific AdaC3 algorithms to address the class imbalance problem more effectively, using kernelized WELM as the component classifier of the proposed ensembles, which are variants of the AdaC1, AdaC2 and AdaC3 algorithms. The cost-sensitive boosting classifiers AdaC1, AdaC2 and AdaC3 assign initial weights to the training instances without considering class skewness, whereas this work assigns the initial weights based on the class skewness. Moreover, the proposed ensembles rescale the weights of the instances of each class after every iteration so that the total weight assigned to each class remains equal. The proposed algorithms are evaluated on benchmark real-world imbalanced datasets from the KEEL dataset repository, and the experimental results show their superiority over other state-of-the-art ensemble methods for class imbalance learning.
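What follows is a minimal, hedged Python sketch of the class-specific cost-sensitive boosting loop described in the abstract, in its AdaC2-style form. It is not the authors' implementation: the paper uses kernelized WELM as the component classifier, whereas this sketch substitutes a scikit-learn decision tree (any base learner accepting per-instance sample weights would serve), and the function name class_specific_adac2, the cost parameters cost_pos and cost_neg, and the exact initialisation (a total weight of 1/2 per class) are illustrative assumptions read from the abstract rather than taken from the paper's equations.

import numpy as np
from sklearn.tree import DecisionTreeClassifier  # stand-in for the kernelized WELM base learner

def class_specific_adac2(X, y, n_rounds=10, cost_pos=2.0, cost_neg=1.0):
    """Class-specific AdaC2-style boosting sketch; y must be +1 (minority) / -1 (majority)."""
    # Class-skew-based initial weights (assumption): each class starts with a
    # total weight of 1/2, so minority instances receive individually larger weights.
    w = np.where(y == 1, 0.5 / np.sum(y == 1), 0.5 / np.sum(y == -1))
    cost = np.where(y == 1, cost_pos, cost_neg)  # per-class misclassification costs

    learners, alphas = [], []
    for _ in range(n_rounds):
        clf = DecisionTreeClassifier(max_depth=3)
        clf.fit(X, y, sample_weight=w)
        pred = clf.predict(X)
        correct = pred == y

        # AdaC2 step size: cost-weighted ratio of correctly to incorrectly
        # classified weight mass.
        alpha = 0.5 * np.log(
            (np.sum(cost[correct] * w[correct]) + 1e-10)
            / (np.sum(cost[~correct] * w[~correct]) + 1e-10)
        )

        # AdaC2-style update: the cost factor multiplies the weight outside
        # the exponential term.
        w = cost * w * np.exp(-alpha * y * pred)

        # Class-specific rescaling: renormalise after every round so that the
        # total weight of each class stays equal (1/2 each).
        for cls in (1, -1):
            mask = y == cls
            w[mask] *= 0.5 / np.sum(w[mask])

        learners.append(clf)
        alphas.append(alpha)
    return learners, alphas

def ensemble_predict(learners, alphas, X):
    # Weighted majority vote over the boosted component classifiers.
    votes = sum(a * clf.predict(X) for clf, a in zip(learners, alphas))
    return np.where(votes >= 0, 1, -1)

The per-class renormalisation step is what distinguishes the class-specific variants sketched here: it prevents the weight mass of the minority class from being eroded across boosting rounds, which is the effect the abstract attributes to the proposed rescaling.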
Pages: 263-283
Page count: 20