Constrained class-wise feature selection (CCFS)
Cited by: 2
Authors:
Hussain, Syed Fawad [1,2]
Shahzadi, Fatima [1,2]
Munir, Badre [1]
Affiliations:
[1] GIK Inst Engn Sci & Technol, Topi 23460, Khyber Pakhtunkhwa, Pakistan
[2] GIK Inst, Machine Learning & Data Sci Lab MDS, Topi, Pakistan
Keywords:
Feature selection;
Information theory;
Classification;
Class-wise feature selection;
MUTUAL INFORMATION;
TEXT CLASSIFICATION;
MACHINE;
DOI:
10.1007/s13042-022-01589-5
CLC number:
TP18 [Artificial Intelligence Theory];
Discipline codes:
081104; 0812; 0835; 1405
Abstract:
Feature selection plays a vital role as a preprocessing step for high-dimensional data in machine learning. Its basic purpose is to avoid the "curse of dimensionality" and to reduce the time and space complexity of training. Several techniques, including information-theoretic ones, have been proposed in the literature to measure the information content of a feature. Most of them incrementally select features with maximum dependency on the category but minimum redundancy with already selected features. A key idea missing from these techniques is a fair representation of features with maximum dependency across the different categories; that is, they allow a skewed selection of features having high mutual information (MI) with one particular class. This can result in classification biased in favor of that class, while the other classes obtain low matching scores during classification. We propose a novel information-theoretic approach that selects features in a class-wise fashion rather than by their global maximum dependency. In addition, a constrained search is used instead of a global sequential forward search. We prove that our proposed approach enhances Maximum Relevance while keeping Minimum Redundancy under a constrained search. Results on multiple benchmark datasets show that our proposed method improves accuracy compared to other state-of-the-art feature selection algorithms while having lower time complexity.
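The abstract describes scoring and selecting features per class rather than by a single global relevance ranking. The Python sketch below illustrates only that general idea and is not the paper's CCFS algorithm: the function name classwise_mi_selection, the even per-class budget, and the use of scikit-learn's mutual_info_classif on a one-vs-rest class indicator are assumptions made for illustration, and the sketch omits the paper's redundancy term and constrained search.

# Minimal illustrative sketch (assumptions noted above), not the authors' CCFS method.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def classwise_mi_selection(X, y, n_features_total):
    """Select about n_features_total features, splitting the budget evenly among classes."""
    classes = np.unique(y)
    per_class = max(1, n_features_total // len(classes))
    selected = []
    for c in classes:
        # One-vs-rest relevance: MI between each feature and the indicator of class c.
        mi = mutual_info_classif(X, (y == c).astype(int), random_state=0)
        picked = 0
        for idx in np.argsort(mi)[::-1]:
            if idx not in selected:
                selected.append(int(idx))
                picked += 1
            if picked == per_class:
                break
    return selected[:n_features_total]

# Example usage on a small multi-class dataset.
from sklearn.datasets import load_digits
X, y = load_digits(return_X_y=True)
print(classwise_mi_selection(X, y, n_features_total=20))

Giving each class its own selection budget is what prevents the skewed, single-class-dominated ranking the abstract warns about; a full implementation in the paper's spirit would also penalize redundancy among already selected features during each per-class pass.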
Pages: 3211-3224
Number of pages: 14