A General Framework for Class Label Specific Mutual Information Feature Selection Method

Cited: 10
Authors
Rakesh, Deepak Kumar [1 ]
Jana, Prasanta K. [1 ]
Affiliation
[1] Indian Inst Technol ISM Dhanbad, Dept Comp Sci & Engn, Dhanbad 826004, Bihar, India
Keywords
Mutual information; Feature extraction; Redundancy; Entropy; Magnetic resonance imaging; Information filters; Correlation; Feature selection; filter method; information theory; class label specific mutual information; classification; DEPENDENCY; RELEVANCE; SMOTE
DOI
10.1109/TIT.2022.3188708
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Information theory-based feature selection (ITFS) methods select a single subset of features for all classes based on two criteria: 1) minimizing redundancy among the selected features and 2) maximizing the classification information that the selected features share with the classes. A critical issue with selecting a single subset of features is that it may not represent a feature space in which each individual class label can be separated exclusively. Existing methods provide no way to select a feature space specific to an individual class label. To this end, we propose a novel feature selection method called class-label specific mutual information (CSMI) that selects a specific set of features for each class label. The proposed method maximizes the information shared between the selected features and the target class label while minimizing the information they share with all classes. We also account for the dynamic change of information between the selected features and the target class label when a candidate feature is added. Finally, we provide a general framework for CSMI that makes it classifier-independent. We performed experiments on sixteen benchmark data sets using four classifiers and found that CSMI outperforms five traditional methods, two state-of-the-art ITFS methods (multi-class classification), and one multi-label classification method.
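The idea of class-label specific selection described in the abstract can be sketched with a toy greedy selector. This is an illustrative simplification, not the authors' exact CSMI objective: the one-vs-rest indicator for the target class, the trade-off weight `beta`, and the omission of the inter-feature redundancy and dynamic-change terms are all assumptions made here for brevity.

```python
import numpy as np
from collections import Counter

def mutual_information(x, y):
    """Empirical mutual information I(X; Y) in bits for discrete arrays."""
    n = len(x)
    mi = 0.0
    for xv, cx in Counter(x).items():
        for yv, cy in Counter(y).items():
            cxy = np.sum((x == xv) & (y == yv))
            if cxy > 0:
                mi += (cxy / n) * np.log2(cxy * n / (cx * cy))
    return mi

def class_specific_selection(X, y, target_class, k, beta=0.5):
    """Greedily pick k features for one class label: score each candidate
    by its MI with the one-vs-rest indicator of the target class, minus a
    beta-weighted penalty for MI with the full class variable (a stand-in
    for the CSMI idea of maximizing target-class information while
    limiting information shared with all classes)."""
    y_c = (y == target_class).astype(int)  # one-vs-rest indicator
    selected, candidates = [], list(range(X.shape[1]))
    while len(selected) < k and candidates:
        best = max(candidates,
                   key=lambda j: mutual_information(X[:, j], y_c)
                                 - beta * mutual_information(X[:, j], y))
        selected.append(best)
        candidates.remove(best)
    return selected
```

On a small discrete data set, a feature that exactly indicates the target class scores higher under this criterion than either a noise feature or a feature that encodes all classes at once, which is the behavior the abstract motivates.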
Pages: 7996 - 8014
Page count: 19
Related Papers (50 total)
  • [1] Class-specific mutual information variation for feature selection
    Gao, Wanfu
    Hu, Liang
    Zhang, Ping
    PATTERN RECOGNITION, 2018, 79 : 328 - 339
  • [2] Feature selection with dynamic mutual information
    Liu, Huawen
    Sun, Jigui
    Liu, Lei
    Zhang, Huijie
    PATTERN RECOGNITION, 2009, 42 (07) : 1330 - 1339
  • [3] Input Feature Selection Method Based on Feature Set Equivalence and Mutual Information Gain Maximization
    Wang, Xinzheng
    Guo, Bing
    Shen, Yan
    Zhou, Chimin
    Duan, Xuliang
    IEEE ACCESS, 2019, 7 : 151525 - 151538
  • [4] Multilabel Feature Selection Based on Fuzzy Mutual Information and Orthogonal Regression
    Dai, Jianhua
    Liu, Qi
    Chen, Wenxiang
    Zhang, Chucai
    IEEE TRANSACTIONS ON FUZZY SYSTEMS, 2024, 32 (09) : 5136 - 5148
  • [5] A review of feature selection methods based on mutual information
    Vergara, Jorge R.
    Estevez, Pablo A.
    NEURAL COMPUTING & APPLICATIONS, 2014, 24 (01) : 175 - 186
  • [6] A Fast Feature Selection Method Based on Mutual Information in Multi-label Learning
    Sun, Zhenqiang
    Zhang, Jia
    Luo, Zhiming
    Cao, Donglin
    Li, Shaozi
    COMPUTER SUPPORTED COOPERATIVE WORK AND SOCIAL COMPUTING, CHINESECSCW 2018, 2019, 917 : 424 - 437
  • [7] Estimating mutual information for feature selection in the presence of label noise
    Frenay, Benoit
    Doquire, Gauthier
    Verleysen, Michel
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2014, 71 : 832 - 848
  • [8] A Feature Subset Selection Method Based On High-Dimensional Mutual Information
    Zheng, Yun
    Kwoh, Chee Keong
    ENTROPY, 2011, 13 (04) : 860 - 901
  • [9] General framework for class-specific feature selection
    Pineda-Bautista, Barbara B.
    Carrasco-Ochoa, J. A.
    Fco Martinez-Trinidad, J.
    EXPERT SYSTEMS WITH APPLICATIONS, 2011, 38 (08) : 10018 - 10024
  • [10] Granular multi-label feature selection based on mutual information
    Li, Feng
    Miao, Duoqian
    Pedrycz, Witold
    PATTERN RECOGNITION, 2017, 67 : 410 - 423