Evolutionary under-sampling based bagging ensemble method for imbalanced data classification

Times Cited: 56
Authors
Sun, Bo [1 ,2 ]
Chen, Haiyan [1 ,2 ]
Wang, Jiandong [1 ]
Xie, Hua [2 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Coll Comp Sci & Technol, Nanjing 210016, Jiangsu, Peoples R China
[2] Nanjing Univ Aeronaut & Astronaut, Natl Key Lab ATFM, Nanjing 211106, Jiangsu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
class imbalanced problem; under-sampling; bagging; evolutionary under-sampling; ensemble learning; machine learning; data mining; SUPPORT VECTOR MACHINES; DATA-SETS; SMOTE; CLASSIFIERS; STRATEGIES;
DOI
10.1007/s11704-016-5306-z
Chinese Library Classification (CLC) code
TP [Automation Technology, Computer Technology];
Discipline classification code
0812;
Abstract
In the class imbalanced learning scenario, traditional machine learning algorithms that optimize overall accuracy tend to achieve poor classification performance, especially on the minority class in which we are most interested. Many effective approaches have been proposed to address this problem. Among them, bagging ensemble methods integrated with under-sampling techniques have demonstrated better performance than several alternatives, including bagging ensembles integrated with over-sampling techniques and cost-sensitive methods. Although these under-sampling techniques promote diversity among the generated base classifiers through random partitioning or sampling of the majority class, they take no measures to ensure the classification performance of the individual base classifiers, which limits the ensemble performance that can be achieved. On the other hand, evolutionary under-sampling (EUS), a novel under-sampling technique, has been successfully applied to searching for the best majority class subset for training a well-performing nearest neighbor classifier. Inspired by EUS, in this paper we introduce it into the under-sampling bagging framework and propose an EUS-based bagging ensemble method, EUS-Bag, which uses a new fitness function considering three factors to make EUS better suited to the framework. With this fitness function, EUS-Bag can generate a set of accurate and diverse base classifiers. To verify the effectiveness of EUS-Bag, we conduct a series of comparison experiments on 22 two-class imbalanced classification problems. Experimental results measured using recall, geometric mean, and AUC all demonstrate its superior performance.
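The following is a minimal illustrative sketch, not the authors' implementation, of an under-sampling bagging ensemble in the spirit of EUS-Bag. The function names (eus_bag_sketch, g_mean, predict_majority_vote), the assumption that the minority class is labeled 1 and the majority class 0, the random-restart search over majority-class subsets, and the two-term fitness (G-mean of the candidate classifier minus an agreement penalty against previously accepted members) are all simplifying assumptions standing in for the paper's genetic search and three-factor fitness function.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import recall_score


def g_mean(y_true, y_pred):
    """Geometric mean of sensitivity (minority recall) and specificity."""
    sens = recall_score(y_true, y_pred, pos_label=1)
    spec = recall_score(y_true, y_pred, pos_label=0)
    return np.sqrt(sens * spec)


def eus_bag_sketch(X, y, n_estimators=10, n_candidates=20, alpha=0.5, seed=0):
    """Bagging with searched under-sampling: each base classifier is trained on
    all minority samples plus the majority subset with the best fitness among
    n_candidates random candidates (a stand-in for an evolutionary search)."""
    rng = np.random.default_rng(seed)
    min_idx = np.flatnonzero(y == 1)   # assumed minority class label: 1
    maj_idx = np.flatnonzero(y == 0)   # assumed majority class label: 0
    ensemble, member_preds = [], []
    for _ in range(n_estimators):
        best = None
        for _ in range(n_candidates):
            subset = rng.choice(maj_idx, size=len(min_idx), replace=False)
            train = np.concatenate([min_idx, subset])
            clf = DecisionTreeClassifier(random_state=0).fit(X[train], y[train])
            pred = clf.predict(X)
            fitness = g_mean(y, pred)                      # individual-accuracy term
            if member_preds:                               # diversity term: penalize
                agreement = np.mean([np.mean(pred == p)    # agreement with accepted
                                     for p in member_preds])
                fitness -= alpha * agreement
            if best is None or fitness > best[0]:
                best = (fitness, clf, pred)
        ensemble.append(best[1])
        member_preds.append(best[2])
    return ensemble


def predict_majority_vote(ensemble, X):
    """Combine base classifiers by unweighted majority vote."""
    votes = np.array([clf.predict(X) for clf in ensemble])
    return (votes.mean(axis=0) >= 0.5).astype(int)
```

The design point the sketch tries to convey is that each member's majority subset is chosen by a searched fitness rather than by purely random sampling, so individual members stay accurate while the agreement penalty preserves diversity across the ensemble.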
Pages: 331-350
Number of pages: 20