GLEE: A granularity filter for feature selection

Cited by: 14
Authors
Ba, Jing [1]
Wang, Pingxin [2]
Yang, Xibei [1,3]
Yu, Hualong [1]
Yu, Dongjun [4]
Affiliations
[1] Jiangsu Univ Sci & Technol, Sch Comp, Zhenjiang 212100, Jiangsu, Peoples R China
[2] Jiangsu Univ Sci & Technol, Sch Sci, Zhenjiang 212003, Jiangsu, Peoples R China
[3] Zhejiang Ocean Univ, Key Lab Oceanog Big Data Min & Applicat Zhejiang P, Zhoushan 316022, Zhejiang, Peoples R China
[4] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Nanjing 210094, Jiangsu, Peoples R China
Keywords
Feature selection; GLEE; Granularity filter; Neighborhood; Rough set; ATTRIBUTE REDUCTION; KNOWLEDGE GRANULATION; NEURAL-NETWORKS; EFFICIENT; FUZZY; ACCELERATOR; ENTROPY; SYSTEMS; MODEL
DOI
10.1016/j.engappai.2023.106080
CLC number
TP [Automation Technology, Computer Technology];
Subject classification number
0812;
Abstract
In the field of Granular Computing (GrC), feature selection is an appealing task. Basic GrC concepts such as information granulation and granularity have guided the exploration of feature selection techniques. Nevertheless, challenges arise when performing feature selection, such as the high computational cost of repeating information granulation and the multi-granularity structure encountered in the process of searching for qualified features. In this study, to improve the efficiency and effectiveness of feature selectors, a novel framework named GLEE (Granularity fiLter for fEature sElection) is reported. Firstly, the granularity value associated with each feature is calculated. Secondly, all features are reordered by their granularity values. Finally, following the derived sequence, features are added to the selection pool one by one until the termination condition is satisfied. GLEE not only eliminates the iterative calculation of information granulation throughout the selection process, but also provides a feature sequence that may be insensitive to data perturbation. It should also be emphasized that GLEE is a general framework: most existing termination conditions for feature selection can be embedded into it. To validate the effectiveness of GLEE, it is compared with several well-established feature selection schemes in terms of the elapsed time of feature selection, the stability of the selected features, and classification performance. Experimental results over 20 UCI datasets, with both raw features and 3 different ratios of noisy features, demonstrate that our framework is superior, yielding better robustness with satisfactory elapsed time.
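The three steps summarized in the abstract can be sketched in a few lines of Python. This is a minimal illustration only, assuming a neighborhood-based knowledge granulation measure (average neighborhood size under a distance threshold) and a user-supplied termination predicate; the names neighborhood_granularity and glee_select, the radius parameter, and the ascending ranking direction are illustrative assumptions rather than details taken from the paper.

import numpy as np

def neighborhood_granularity(feature_column, radius=0.15):
    # Illustrative knowledge granulation induced by a single feature:
    # the average neighborhood size under the distance threshold `radius`,
    # normalized to [0, 1]; smaller values indicate a finer granularity.
    x = np.asarray(feature_column, dtype=float).reshape(-1, 1)
    rng = x.max() - x.min()
    if rng > 0:                              # min-max normalize the single feature
        x = (x - x.min()) / rng
    dist = np.abs(x - x.T)                   # pairwise distances, shape (n, n)
    neighborhoods = dist <= radius           # boolean neighborhood membership
    n = x.shape[0]
    return neighborhoods.sum() / (n * n)     # mean neighborhood size divided by n

def glee_select(X, terminate):
    # GLEE-style selection sketch:
    # 1. compute the granularity of each feature once,
    # 2. rank features by granularity (ascending order assumed here),
    # 3. add features along that ranking until terminate(selected) holds.
    granularities = [neighborhood_granularity(X[:, j]) for j in range(X.shape[1])]
    order = np.argsort(granularities)
    selected = []
    for j in order:
        selected.append(int(j))
        if terminate(selected):
            break
    return selected

A termination predicate could, for instance, stop once a fixed number of features has been collected or once the approximation quality of the selected subset matches that of the full feature set; as the abstract notes, GLEE leaves this condition pluggable.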
Pages: 13