Class Imbalance Ensemble Learning Based on the Margin Theory

Cited by: 113
Authors
Feng, Wei [1 ]
Huang, Wenjiang [1 ]
Ren, Jinchang [2 ]
Affiliations
[1] Chinese Acad Sci, Inst Remote Sensing & Digital Earth, Key Lab Digital Earth Sci, Beijing 100094, Peoples R China
[2] Univ Strathclyde, Dept Elect & Elect Engn, Glasgow G1 1XW, Lanark, Scotland
Source
APPLIED SCIENCES-BASEL | 2018, Vol. 8, Issue 05
Funding
National Key R&D Program of China; National Natural Science Foundation of China;
Keywords
classification; ensemble margin; imbalance learning; ensemble learning; multi-class; SUPPORT VECTOR MACHINES; DATA-SETS; STATISTICAL COMPARISONS; CLASSIFICATION; PERFORMANCE; DIVERSITY; CLASSIFIERS; PREDICTION; ALGORITHM; IMPROVE;
DOI
10.3390/app8050815
Chinese Library Classification
O6 [Chemistry];
Discipline Classification Code
0703
Abstract
The proportion of instances belonging to each class in a dataset plays an important role in machine learning. However, real-world data often suffer from class imbalance. Dealing with multi-class tasks with different misclassification costs per class is harder than dealing with two-class ones. Undersampling and oversampling are two of the most popular data preprocessing techniques for imbalanced datasets. Ensemble classifiers have been shown to be more effective than data sampling techniques at enhancing the classification performance on imbalanced data. Moreover, combining ensemble learning with sampling methods to tackle the class imbalance problem has led to several proposals in the literature, with positive results. The ensemble margin is a fundamental concept in ensemble learning. Several studies have shown that the generalization performance of an ensemble classifier is related to the distribution of its margins on the training examples. In this paper, we propose a novel ensemble-margin-based algorithm, which handles imbalanced classification by employing more low-margin examples, which are more informative than high-margin samples. This algorithm combines ensemble learning with undersampling; but instead of balancing classes randomly, as in UnderBagging, our method focuses on constructing a higher-quality balanced set for each base classifier. To demonstrate the effectiveness of the proposed method in handling class-imbalanced data, UnderBagging and SMOTEBagging are used in a comparative analysis. In addition, we also compare the performance of different ensemble margin definitions, both supervised and unsupervised, in class imbalance learning.
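The margin-guided undersampling idea described in the abstract can be illustrated with a minimal NumPy sketch. This is an assumed illustration, not the paper's actual implementation: function names, the vote-matrix representation (one row per example, one column per base classifier), and the toy data are all invented here. It shows a classic supervised ensemble margin, (votes for the true class minus the maximum votes for any other class) / T, an unsupervised variant based on the two largest vote counts, and a balancing step that keeps all minority examples plus the lowest-margin (most informative) majority examples.

```python
import numpy as np

def supervised_margin(votes, y, n_classes):
    """Supervised ensemble margin: (v_true - max_other) / T, where v_c is
    the number of base classifiers voting for class c and T is the
    ensemble size. Ranges over [-1, 1]."""
    T = votes.shape[1]
    counts = np.stack([(votes == c).sum(axis=1) for c in range(n_classes)], axis=1)
    idx = np.arange(len(y))
    v_true = counts[idx, y]
    others = counts.copy()
    others[idx, y] = -1          # mask the true class before taking the max
    return (v_true - others.max(axis=1)) / T

def unsupervised_margin(votes, n_classes):
    """Unsupervised margin: (v1 - v2) / T with v1, v2 the two largest vote
    counts; needs no class labels. Ranges over [0, 1]."""
    T = votes.shape[1]
    counts = np.stack([(votes == c).sum(axis=1) for c in range(n_classes)], axis=1)
    top2 = np.sort(counts, axis=1)[:, -2:]
    return (top2[:, 1] - top2[:, 0]) / T

def margin_guided_balanced_subset(y, margins, minority_class):
    """Keep every minority example; from the rest, keep the lowest-margin
    examples until class sizes match (instead of sampling at random)."""
    minority_idx = np.flatnonzero(y == minority_class)
    majority_idx = np.flatnonzero(y != minority_class)
    order = majority_idx[np.argsort(margins[majority_idx])]
    return np.sort(np.concatenate([minority_idx, order[:len(minority_idx)]]))

# Toy example: 6 majority (class 0) and 2 minority (class 1) examples,
# each voted on by T = 5 base classifiers.
votes = np.array([
    [0, 0, 0, 0, 0],   # unanimous, high margin
    [0, 0, 0, 0, 1],
    [0, 0, 0, 1, 1],   # contested, low margin
    [0, 0, 1, 1, 1],   # misclassified by majority vote, negative margin
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
    [1, 1, 1, 0, 0],
    [1, 1, 1, 1, 1],
])
y = np.array([0, 0, 0, 0, 0, 0, 1, 1])
margins = supervised_margin(votes, y, n_classes=2)
subset = margin_guided_balanced_subset(y, margins, minority_class=1)
print(subset)  # both minority examples plus the two lowest-margin majority ones
```

The resulting balanced subset retains the contested majority examples near the decision boundary, which is the intuition behind preferring low-margin instances over random undersampling.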
Pages: 28