Parallel Selector for Feature Reduction

Cited by: 1
Authors
Yin, Zhenyu [1 ]
Fan, Yan [1 ]
Wang, Pingxin [2 ]
Chen, Jianjun [1 ]
Affiliations
[1] Jiangsu Univ Sci & Technol, Sch Comp, Zhenjiang 212100, Peoples R China
[2] Jiangsu Univ Sci & Technol, Sch Sci, Zhenjiang 212100, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
rough set; feature reduction; feature evaluation; data perturbation; ATTRIBUTE REDUCTION; CONDITIONAL-ENTROPY; NEURAL-NETWORKS; ROUGH; FUZZY; INFORMATION; ACCELERATOR;
DOI
10.3390/math11092084
Chinese Library Classification
O1 [Mathematics];
Subject Classification Codes
0701; 070101;
Abstract
In the field of rough sets, feature reduction is an active research topic. To date, various strategies for feature reduction have been developed to guide exploration of this topic. Nevertheless, several challenges with these strategies should not be ignored: (1) the viewpoint provided by a single, fixed measure is too narrow; (2) a final reduct derived from a single constraint is sometimes not robust to data perturbation; (3) deriving the final reduct is often inefficient. In this study, to improve both the effectiveness and the efficiency of feature reduction algorithms, a novel framework named the parallel selector for feature reduction is reported. Firstly, the granularity of the raw features is quantitatively characterized. Secondly, the raw features are sorted by these granularity values. Thirdly, the reordered features are evaluated again. Finally, based on these two evaluations, the reordered features are divided into groups, and the features satisfying the given constraints are selected in parallel. The framework not only yields a relatively stable feature ordering under data perturbation but also reduces the time consumed by feature reduction. Experimental results over 25 UCI data sets with four different ratios of noisy labels demonstrate the superiority of the framework in comparison with eight state-of-the-art algorithms.
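The abstract describes a pipeline of granularity scoring, sorting, re-evaluation, grouping, and constrained selection. The sketch below is a minimal, hypothetical rendering of that pipeline, not the authors' published algorithm: the knowledge-granularity measure, the positive-region dependency used for re-evaluation and as the stopping constraint, the grouping strategy, and all function names (partition, granularity, dependency, parallel_selector) are illustrative assumptions.

```python
# Hypothetical sketch of the granularity -> sort -> re-evaluate -> group -> select
# pipeline described in the abstract; measures, grouping, and the stopping rule are
# illustrative assumptions, not the published algorithm.
import numpy as np
from collections import defaultdict


def partition(X, cols):
    """Group sample indices into the equivalence classes induced by the given columns."""
    blocks = defaultdict(list)
    for i, row in enumerate(X[:, cols]):
        blocks[tuple(row)].append(i)
    return list(blocks.values())


def granularity(X, col):
    """Knowledge granularity of one feature: sum(|block|^2) / |U|^2 (smaller = finer)."""
    n = X.shape[0]
    return sum(len(b) ** 2 for b in partition(X, [col])) / n ** 2


def dependency(X, y, cols):
    """Positive-region dependency: fraction of samples in label-pure equivalence classes."""
    if not cols:
        return 0.0
    pure = sum(len(b) for b in partition(X, cols) if len(set(y[b])) == 1)
    return pure / X.shape[0]


def parallel_selector(X, y, n_groups=3, tol=1e-3):
    # Steps 1-2: score every raw feature by granularity and sort (finest first).
    order = sorted(range(X.shape[1]), key=lambda c: granularity(X, c))
    # Step 3: re-evaluate the reordered features, here by single-feature dependency.
    scores = {c: dependency(X, y, [c]) for c in order}
    # Step 4: split the ordered features into groups; pick the best feature of each
    # group until the reduct (nearly) matches the dependency of the full feature set.
    target = dependency(X, y, list(range(X.shape[1])))
    reduct = []
    for group in np.array_split(np.array(order), n_groups):
        reduct.append(int(max(group, key=lambda c: scores[int(c)])))
        if dependency(X, y, reduct) >= target - tol:
            break
    return reduct


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.integers(0, 3, size=(200, 8))   # discrete-valued toy decision table
    y = X[:, 2] % 2                         # label fully determined by feature 2
    print("selected reduct:", parallel_selector(X, y))
```

In this sketch the grouping decouples the per-group evaluations, so each group's candidate could be scored concurrently (e.g., with multiprocessing), which is where the time savings claimed in the abstract would come from; the loop is kept sequential here for clarity.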
Pages: 33