Improving Random Forest and Rotation Forest for highly imbalanced datasets

Cited: 42
Authors
Su, Chong [1 ,2 ]
Ju, Shenggen [1 ]
Liu, Yiguang [1 ]
Yu, Zhonghua [1 ]
Affiliations
[1] Sichuan Univ, Dept Comp, Chengdu 610065, Sichuan, Peoples R China
[2] Nanjing Jiangbei Peoples Hosp, Informat Ctr, Nanjing, Jiangsu, Peoples R China
Keywords
Random Forest; Rotation Forest; Hellinger distance; Hellinger distance decision tree (HDDT); highly imbalanced datasets; STATISTICAL COMPARISONS; CLASSIFICATION; CLASSIFIERS;
DOI
10.3233/IDA-150789
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Decision trees are simple and effective, and their performance can be further improved with ensemble methods. Random Forest and Rotation Forest are two approaches now regarded as classics. They build more accurate and diverse classifiers than Bagging and Boosting by introducing diversity, namely a randomly chosen subset of features or a rotated feature space. However, the splitting criteria used to construct each tree in Random Forest and Rotation Forest, the Gini index and the information gain ratio respectively, are skew-sensitive. When learning from highly imbalanced datasets, class imbalance impedes their ability to learn the minority-class concept. The Hellinger distance decision tree (HDDT), proposed by Chawla, is skew-insensitive. In particular, bagged unpruned HDDTs have proven effective on highly imbalanced problems. Nevertheless, the bootstrap sampling used in Bagging can yield ensembles of lower diversity than Random Forest and Rotation Forest. To combine the skew-insensitivity of HDDT with the diversity of Random Forest and Rotation Forest, we use the Hellinger distance as the splitting criterion for building each tree in Random Forest and in Rotation Forest. An experimental framework spanning a wide range of highly imbalanced datasets investigates the effectiveness of the Hellinger distance, the information gain ratio, and the Gini index as splitting criteria in ensembles of decision trees, including Bagging, Boosting, Random Forest, and Rotation Forest. Balanced Random Forest is also included in the experiment, since it is designed to tackle the class imbalance problem.
The experimental results, compared through nonparametric statistical tests, demonstrate that using the Hellinger distance as the splitting criterion for the individual decision trees in a forest improves the performance of Random Forest and Rotation Forest on highly imbalanced classification.
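The skew-insensitivity of the Hellinger criterion comes from comparing, for each branch of a candidate split, the fraction of positives routed there against the fraction of negatives routed there, so the class prior cancels out. The following is a minimal sketch of that idea for the two-branch, binary-class case (not code from the paper; the function name and signature are illustrative, following Cieslak and Chawla's HDDT formulation):

```python
import numpy as np

def hellinger_split_score(y_left, y_right, pos_label=1):
    """Hellinger distance between the class distributions of a binary split.

    For each branch, compare the share of all positives that reach it with
    the share of all negatives that reach it. A larger distance means the
    split separates the classes better; the class prior does not appear,
    which makes the criterion insensitive to class skew.
    """
    y_left = np.asarray(y_left)
    y_right = np.asarray(y_right)
    y_all = np.concatenate([y_left, y_right])
    n_pos = int(np.sum(y_all == pos_label))
    n_neg = len(y_all) - n_pos
    if n_pos == 0 or n_neg == 0:
        return 0.0  # only one class present: nothing to separate
    dist_sq = 0.0
    for branch in (y_left, y_right):
        pos_share = np.sum(branch == pos_label) / n_pos  # positives captured
        neg_share = np.sum(branch != pos_label) / n_neg  # negatives captured
        dist_sq += (np.sqrt(pos_share) - np.sqrt(neg_share)) ** 2
    return float(np.sqrt(dist_sq))
```

A split that sends all positives left and all negatives right scores sqrt(2), the maximum; a split that routes both classes in equal proportions scores 0, regardless of how imbalanced the dataset is. In the paper's approach this score would replace the Gini index (Random Forest) or the information gain ratio (Rotation Forest) when selecting the best split at each node.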
Pages: 1409-1432 (24 pages)