Self-paced Ensemble for Highly Imbalanced Massive Data Classification

Cited by: 139
Authors
Liu, Zhining [1,2]
Cao, Wei [3]
Gao, Zhifeng [3]
Bian, Jiang [3]
Chen, Hechang [1,2]
Chang, Yi [1,2]
Liu, Tie-Yan [3]
Affiliations
[1] Jilin Univ, Sch Artificial Intelligence, Changchun, Peoples R China
[2] Jilin Univ, Minist Educ, Key Lab Symbol Computat & Knowledge Engn, Changchun, Peoples R China
[3] Microsoft Res, Beijing, Peoples R China
Source
2020 IEEE 36TH INTERNATIONAL CONFERENCE ON DATA ENGINEERING (ICDE 2020) | 2020
Funding
National Natural Science Foundation of China;
Keywords
imbalance learning; imbalance classification; ensemble learning; data re-sampling; SMOTE;
DOI
10.1109/ICDE48307.2020.00078
CLC number
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
Many real-world applications reveal difficulties in learning classifiers from imbalanced data. The rising big data era has been witnessing more classification tasks with large-scale but extremely imbalanced and low-quality datasets. Most existing learning methods suffer from poor performance or low computational efficiency in this scenario. To tackle this problem, we conduct a deep investigation into the nature of class imbalance, which reveals that not only the disproportion between classes but also other difficulties embedded in the data, especially noise and class overlap, prevent us from learning effective classifiers. Taking these factors into consideration, we propose a novel framework for imbalanced classification that aims to generate a strong ensemble by self-paced harmonizing of data hardness via under-sampling. Extensive experiments show that this new framework, while being very computationally efficient, leads to robust performance even under highly overlapping classes and extremely skewed distributions. Note that our method can be easily adapted to most existing learning methods (e.g., C4.5, SVM, GBDT, and neural networks) to boost their performance on imbalanced data.
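The abstract describes building an ensemble by repeatedly under-sampling the majority class while shifting attention across sample "hardness" levels in a self-paced manner. Below is a minimal Python sketch of one plausible reading of that idea; the class name SelfPacedEnsembleSketch, the probability-error hardness function, and the easy-to-uniform bin-weighting schedule are illustrative assumptions, not the authors' exact algorithm (see the DOI above for the real method).

```python
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier


class SelfPacedEnsembleSketch:
    """Toy self-paced under-sampling ensemble (illustration only, labels in {0, 1})."""

    def __init__(self, base_estimator=None, n_estimators=10, n_bins=10, seed=0):
        self.base = base_estimator if base_estimator is not None else DecisionTreeClassifier()
        self.n_estimators = n_estimators
        self.n_bins = n_bins
        self.rng = np.random.default_rng(seed)
        self.estimators_ = []

    def _proba(self, X):
        # Average predicted minority-class probability over the current ensemble.
        return np.mean([e.predict_proba(X)[:, 1] for e in self.estimators_], axis=0)

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        X_maj, X_min = X[y == 0], X[y == 1]  # convention: label 1 is the minority class
        n_min = len(X_min)
        self.estimators_ = []
        for i in range(self.n_estimators):
            if not self.estimators_:
                # First round: plain random under-sampling of the majority class.
                idx = self.rng.choice(len(X_maj), size=n_min, replace=False)
            else:
                # Hardness of a majority sample = the ensemble's error on it (its
                # true label is 0, so the predicted minority probability is the error).
                hardness = self._proba(X_maj)
                bins = np.minimum((hardness * self.n_bins).astype(int), self.n_bins - 1)
                counts = np.bincount(bins, minlength=self.n_bins).astype(float)
                # "Self-paced" schedule (an assumption): blend bin weights from the
                # raw, easy-dominated distribution toward uniform coverage of all
                # hardness levels as training progresses.
                t = i / max(self.n_estimators - 1, 1)
                w = (1.0 - t) * counts / counts.sum() + t / self.n_bins
                w *= counts > 0  # drop empty bins
                w /= w.sum()
                p = w[bins] / counts[bins]  # per-sample probability from bin weights
                idx = self.rng.choice(len(X_maj), size=n_min, replace=False, p=p / p.sum())
            X_sub = np.vstack([X_maj[idx], X_min])
            y_sub = np.concatenate([np.zeros(n_min), np.ones(n_min)])
            self.estimators_.append(clone(self.base).fit(X_sub, y_sub))
        return self

    def predict(self, X):
        return (self._proba(np.asarray(X, dtype=float)) >= 0.5).astype(int)
```

To try it, generate a skewed binary dataset (e.g., sklearn.datasets.make_classification with weights=[0.95, 0.05]) and compare its balanced accuracy against a single base classifier; swapping base_estimator for an SVM or gradient-boosted trees mirrors the paper's point that the framework can wrap around most base learners.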
Pages: 841-852
Number of pages: 12