Ensemble Fuzzy Feature Selection Based on Relevancy, Redundancy, and Dependency Criteria

Cited by: 11
Authors
Salem, Omar A. M. [1 ,2 ]
Liu, Feng [1 ]
Chen, Yi-Ping Phoebe [3 ]
Chen, Xi [1 ]
Affiliations
[1] Wuhan Univ, Sch Comp Sci, Wuhan 430072, Peoples R China
[2] Suez Canal Univ, Fac Comp & Informat, Dept Informat Syst, Ismailia 41522, Egypt
[3] La Trobe Univ, Dept Comp Sci & Informat Technol, Melbourne, Vic 3086, Australia
Keywords
feature selection; fuzzy sets; mutual information; rough sets; stable feature selection; input feature selection; max-relevance; reduction; performance
DOI: 10.3390/e22070757
CLC classification: O4 [Physics]
Discipline code: 0702
Abstract
The main challenge for classification systems is handling undesirable data. Filter-based feature selection is an effective solution that improves the performance of classification systems by selecting the significant features and discarding the undesirable ones. The success of this approach depends on the information extracted from the data characteristics, and many theoretical frameworks have been introduced to capture different feature relations. Unfortunately, traditional feature selection methods estimate feature significance based on either individual or dependency-based discriminative ability, but not both. This paper introduces a new ensemble feature selection method, called fuzzy feature selection based on relevancy, redundancy, and dependency (FFS-RRD). The proposed method considers both individual and dependency-based discriminative ability to capture all possible feature relations. To evaluate the proposed method, experimental comparisons were conducted against eight state-of-the-art and conventional feature selection methods on 13 benchmark datasets. The results over four well-known classifiers show that the proposed method outperforms the others in terms of classification performance and stability.
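The relevance/redundancy trade-off at the heart of such filter methods can be illustrated with a minimal greedy sketch. This is not the authors' FFS-RRD algorithm (which also uses fuzzy dependency criteria); it is an mRMR-style illustration on discrete features, with hypothetical helper names `mutual_information` and `mrmr_select`:

```python
from collections import Counter
import math

def mutual_information(x, y):
    """Empirical mutual information (in nats) between two discrete sequences."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    mi = 0.0
    for (a, b), c in pxy.items():
        p_ab = c / n
        mi += p_ab * math.log(p_ab / ((px[a] / n) * (py[b] / n)))
    return mi

def mrmr_select(features, labels, k):
    """Greedily pick k features maximizing relevance to the labels
    minus mean redundancy with already-selected features.
    features: dict mapping feature name -> list of discrete values."""
    selected = []
    candidates = set(features)
    relevance = {f: mutual_information(features[f], labels) for f in features}
    while candidates and len(selected) < k:
        def score(f):
            redundancy = (sum(mutual_information(features[f], features[s])
                              for s in selected) / len(selected)) if selected else 0.0
            return relevance[f] - redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected
```

A feature identical to the labels is maximally relevant, while one independent of them scores zero; the redundancy term then penalizes candidates that merely duplicate features already chosen. FFS-RRD additionally folds in a rough-set-style dependency criterion, which this sketch omits.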
Pages: 17
References (52 total)
[1] Anonymous, 2007, ARTIFICIAL INTELLIGE
[2] Battiti R. Using mutual information for selecting features in supervised neural-net learning [J]. IEEE Transactions on Neural Networks, 1994, 5(4): 537-550.
[3] Bennasar M, Hicks Y, Setchi R. Feature selection using Joint Mutual Information Maximisation [J]. Expert Systems with Applications, 2015, 42(22): 8520-8532.
[4] Bolon-Canedo V, Alonso-Betanzos A. Ensembles for feature selection: A review and future trends [J]. Information Fusion, 2019, 52: 1-12.
[5] Bonev B I, 2010, Feature Selection Based on Information Theory.
[6] Caballero Y, Alvarez D, Bel R, Garcia M M. Feature selection algorithms using Rough Set Theory [C]. Proceedings of the 7th International Conference on Intelligent Systems Design and Applications, 2007: 407-411.
[7] Che J, Yang Y, Li L, Bai X, Zhang S, Deng C. Maximum relevance minimum common redundancy feature selection for nonlinear data [J]. Information Sciences, 2017, 409: 68-86.
[8] Ching J Y, Wong A K C, Chan K C C. Class-dependent discretization for inductive learning from continuous and mixed-mode data [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1995, 17(7): 641-651.
[9] Chouchoulas A, Shen Q. Rough set-aided keyword reduction for text categorization [J]. Applied Artificial Intelligence, 2001, 15(9): 843-873.
[10] Dougherty J, 1995, Proceedings of the Twelfth International Conference on Machine Learning: 194.