TDMO: Dynamic multi-dimensional oversampling for exploring data distribution based on extreme gradient boosting learning

Cited: 7
Authors
Jia, Liyan [1 ]
Wang, Zhiping [1 ]
Sun, Pengfei [1 ]
Xu, Zhaohui [2 ]
Yang, Sibo [1 ]
Affiliations
[1] Dalian Maritime Univ, Sch Sci, Dalian 116000, Peoples R China
[2] Dalian Med Univ, Affiliated Hosp 1, Clin Lab Dept, Dalian 116011, Peoples R China
Keywords
Class imbalance learning; Data distribution; Oversampling; k-nearest neighbors; SMOTE; Re-sampling method; Classification; Model; SVM
DOI
10.1016/j.ins.2023.119621
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
The synthetic minority oversampling technique (SMOTE) is the most widely used solution for imbalanced data. Although SMOTE is effective at mitigating class imbalance in most cases, it does not sufficiently exploit the prior distribution of the data. Additionally, most existing SMOTE variants generate new instances at random positions between a minority sample and its nearest neighbors, which risks propagating noise. To address these issues, this paper proposes TDMO, a novel approach to exploring data distributions that combines local distribution trust estimation based on extreme gradient boosting (XGBoost) with dynamic multi-dimensional oversampling. First, undersampling and XGBoost are used to train multiple balanced subsets, identifying the internal structure of the original data and yielding a classification prediction accuracy for each instance, called its confidence level (CL). Then, instances with low CL (i.e., noise) are filtered out, and the densities of the two classes in the neighborhood of each non-noise instance are evaluated to create candidate samples that expand the diversity of the minority class. Finally, the minority class is enhanced by combining multiple samples in a multi-dimensional feature space. Extensive experiments demonstrate that TDMO clearly outperforms the compared oversampling methods and achieves the best classification results.
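The confidence-level (CL) stage described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: a 1-nearest-neighbour rule stands in for XGBoost, the dataset is synthetic, and the 0.5 noise threshold is an assumption.

```python
# Sketch of TDMO's CL stage: train a classifier on several balanced
# undersampled subsets and score each instance by how often the ensemble
# classifies it correctly. Low-CL instances are treated as noise.
import random
from math import dist

def confidence_levels(X, y, n_subsets=5, seed=0):
    rng = random.Random(seed)
    minority = [i for i, label in enumerate(y) if label == 1]
    majority = [i for i, label in enumerate(y) if label == 0]
    hits = [0] * len(X)
    for _ in range(n_subsets):
        # Balanced subset: every minority instance plus an equal-size
        # random sample of the majority class (undersampling).
        subset = minority + rng.sample(majority, len(minority))
        for i, x in enumerate(X):
            # Predict x's label from its nearest neighbour in the subset,
            # excluding x itself so it cannot vote for its own label.
            pool = [j for j in subset if j != i]
            nearest = min(pool, key=lambda j: dist(x, X[j]))
            hits[i] += int(y[nearest] == y[i])
    return [h / n_subsets for h in hits]  # CL of each instance, in [0, 1]

# Majority cluster on the left, minority cluster on the right; the last
# minority point lies on the majority side of the gap and acts as label noise.
X = [(0.00, 0.0), (0.05, 0.0), (0.10, 0.0), (0.15, 0.0), (0.20, 0.0),
     (1.00, 0.0), (1.05, 0.0), (1.10, 0.0), (0.40, 0.0)]
y = [0, 0, 0, 0, 0, 1, 1, 1, 1]
cl = confidence_levels(X, y)
clean = [i for i, c in enumerate(cl) if c >= 0.5]  # noise filtered out
```

Here the mislabelled point is always classified as majority by every balanced subset, so its CL drops to zero and it is excluded before the oversampling stages.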
Pages: 36