A multi-manifold learning based instance weighting and under-sampling for imbalanced data classification problems

Cited by: 6
Authors
Feizi, Tayyebe [1]
Moattar, Mohammad Hossein [1]
Tabatabaee, Hamid [1]
Affiliations
[1] Islamic Azad Univ, Dept Comp Engn, Mashhad Branch, Mashhad, Iran
Keywords
Imbalanced data; Classification; Under-sampling; Multi-manifold learning; Reduction algorithm; SMOTE
DOI
10.1186/s40537-023-00832-2
CLC classification
TP301 [Theory and Methods]
Discipline code
081202
Abstract
Under-sampling is a technique for overcoming the imbalanced class problem; however, selecting the instances to be dropped and measuring their informativeness are important concerns. This paper brings a new point of view to this issue and exploits the structure of the data to decide the importance of data points. For this purpose, a multi-manifold learning approach is proposed. Manifolds represent the underlying structures of data and can help extract the latent space of the data distribution. However, there is no evidence that a single manifold can be relied on to capture the local neighborhood structure of a dataset. Therefore, this paper proposes an ensemble of manifold learning approaches and evaluates each manifold with an information-loss-based heuristic. Having computed the optimality score of each manifold, the centrality and marginality degrees of the samples are computed on the manifolds and weighted by the corresponding score. A gradual elimination approach is then proposed, which balances the classes while avoiding a drop in the F-measure on a validation dataset. The proposed method is evaluated on 22 imbalanced datasets from the KEEL and UCI repositories using different classification measures. The experimental results demonstrate that the proposed approach is more effective than other similar approaches and clearly outperforms previous approaches, especially when the imbalance ratio is very high.
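The abstract describes the pipeline only at a high level. The Python sketch below illustrates one possible reading of it and is not the authors' implementation: the choice of Isomap, locally linear embedding and spectral embedding as the manifold ensemble, the use of trustworthiness (i.e. one minus a neighborhood-based information loss) as the optimality score, the centroid-distance notion of centrality, the logistic regression probe classifier, and the batch size are all illustrative assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.manifold import Isomap, LocallyLinearEmbedding, SpectralEmbedding, trustworthiness
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split


def manifold_weights(X, y, minority_label, n_neighbors=10, n_components=2):
    """Weight majority-class samples by their centrality across several manifold
    embeddings, each embedding scored by how well it preserves neighborhoods."""
    embedders = [
        Isomap(n_neighbors=n_neighbors, n_components=n_components),
        LocallyLinearEmbedding(n_neighbors=n_neighbors, n_components=n_components),
        SpectralEmbedding(n_neighbors=n_neighbors, n_components=n_components),
    ]
    maj_mask = y != minority_label
    weights = np.zeros(maj_mask.sum())
    for emb in embedders:
        Z = emb.fit_transform(X)
        # Optimality score of this manifold: trustworthiness of the embedding,
        # i.e. one minus a neighborhood-based information loss (assumption).
        score = trustworthiness(X, Z, n_neighbors=n_neighbors)
        Z_maj = Z[maj_mask]
        dist = np.linalg.norm(Z_maj - Z_maj.mean(axis=0), axis=1)
        # Centrality degree: majority points deep inside their class are treated
        # as more redundant and accumulate a larger removal weight (assumption).
        weights += score / (1.0 + dist)
    return weights, maj_mask


def gradual_undersample(X, y, minority_label, batch=10, random_state=0):
    """Drop the highest-weight (most redundant) majority samples batch by batch,
    stopping when the classes balance or the validation F-measure degrades."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    X_tr, X_val, y_tr, y_val = train_test_split(
        X, y, test_size=0.25, stratify=y, random_state=random_state)
    weights, maj_mask = manifold_weights(X_tr, y_tr, minority_label)
    order = np.where(maj_mask)[0][np.argsort(-weights)]  # most redundant first
    keep = np.ones(len(y_tr), dtype=bool)

    def val_f1(mask):
        clf = LogisticRegression(max_iter=1000).fit(X_tr[mask], y_tr[mask])
        return f1_score(y_val, clf.predict(X_val), pos_label=minority_label)

    best = val_f1(keep)
    for start in range(0, len(order), batch):
        if (y_tr[keep] != minority_label).sum() <= (y_tr[keep] == minority_label).sum():
            break  # classes are balanced
        trial = keep.copy()
        trial[order[start:start + batch]] = False
        score = val_f1(trial)
        if score < best:
            break  # the F-measure dropped on the validation set
        best, keep = score, trial
    return X_tr[keep], y_tr[keep]

For a dataset (X, y) with minority label 1, gradual_undersample(X, y, minority_label=1) would return a reduced training set in which the most central (assumed most redundant) majority samples have been removed only as long as the validation F-measure did not degrade; all function and parameter names here are hypothetical.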
Pages: 36