Information-theoretic partially labeled heterogeneous feature selection based on neighborhood rough sets

Cited: 14
Authors
Zhang, Hongying [1 ]
Sun, Qianqian [1 ]
Dong, Kezhen [1 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Math & Stat, Xian 710049, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Feature selection; Monotonic entropy; Partially labeled heterogeneous data; ATTRIBUTE REDUCTION;
DOI
10.1016/j.ijar.2022.12.010
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
With the rapid growth of large-scale, real-world datasets, it becomes critical to address the problem of partially labeled heterogeneous feature selection (i.e., the samples have both numerical and categorical features, and some of them have no labels). Existing solutions typically rely on linear correlations between features. In this paper, three monotonic uncertainty measures are defined on equivalence classes and neighborhood classes to study partially labeled heterogeneous feature selection by exploiting nonlinear correlations. First, consistent entropy and monotonic neighborhood entropy, based on classical rough set theory and neighborhood rough set theory, are proposed to construct a uniform measure for feature selection in heterogeneous datasets. Furthermore, a maximal neighborhood entropy strategy is developed by considering the inconsistency of the neighborhood classes described by the features and the partial labels. Finally, two feature selection algorithms are presented based on the three novel monotonic uncertainty measures. Comparative experiments demonstrate the effectiveness and superiority of the newly proposed feature selection measures. (c) 2022 Elsevier Inc. All rights reserved.
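To make the abstract's neighborhood-entropy idea concrete, the sketch below is a minimal illustration and not the paper's algorithm: it assumes a commonly used heterogeneous distance (min-max-normalized absolute differences for numerical features, 0/1 mismatches for categorical ones), a hypothetical neighborhood radius `delta`, and NaN-coded missing labels; the function names `mixed_distance`, `neighborhood_entropy`, and `conditional_neighborhood_entropy` are illustrative, not taken from the paper.

```python
import numpy as np

def mixed_distance(X_num, X_cat):
    """Pairwise heterogeneous distance: mean of min-max-normalized absolute
    differences on numerical features and 0/1 mismatches on categorical ones.
    (A common convention in neighborhood rough sets; the paper's metric may differ.)"""
    n = X_num.shape[0]
    dist = np.zeros((n, n))
    if X_num.shape[1] > 0:
        span = np.ptp(X_num, axis=0)
        span[span == 0] = 1.0                     # guard against constant columns
        Z = (X_num - X_num.min(axis=0)) / span
        dist += np.abs(Z[:, None, :] - Z[None, :, :]).sum(axis=2)
    if X_cat.shape[1] > 0:
        dist += (X_cat[:, None, :] != X_cat[None, :, :]).sum(axis=2)
    return dist / (X_num.shape[1] + X_cat.shape[1])

def neighborhood_entropy(dist, delta=0.15):
    """Neighborhood entropy H_delta(B) = -(1/n) * sum_i log(|n_delta(x_i)| / n),
    where n_delta(x_i) is the set of samples within radius delta of x_i."""
    n = dist.shape[0]
    sizes = (dist <= delta).sum(axis=1)           # each neighborhood contains x_i itself
    return -np.mean(np.log(sizes / n))

def conditional_neighborhood_entropy(dist, y, delta=0.15):
    """Conditional neighborhood entropy of the decision given the features,
    evaluated only on the labeled samples (missing labels are NaN-coded)."""
    labeled = ~np.isnan(y)
    d = dist[np.ix_(labeled, labeled)]
    yl = y[labeled]
    nb = d <= delta                               # delta-neighborhoods
    same = yl[:, None] == yl[None, :]             # same decision class
    return -np.mean(np.log((nb & same).sum(axis=1) / nb.sum(axis=1)))

if __name__ == "__main__":
    gen = np.random.default_rng(0)
    X_num = gen.normal(size=(100, 3))             # 3 numerical features
    X_cat = gen.integers(0, 3, size=(100, 2))     # 2 categorical features (integer-coded)
    y = (X_num[:, 0] > 0).astype(float)
    y[gen.random(100) < 0.4] = np.nan             # make 40% of the labels missing

    d = mixed_distance(X_num, X_cat)
    print("H_delta(features)         =", round(neighborhood_entropy(d), 4))
    print("H_delta(decision | feat.) =", round(conditional_neighborhood_entropy(d, y), 4))
```

A greedy forward search that adds, at each step, the feature most reducing such a conditional entropy would be one plausible way to turn these measures into a selection algorithm; the paper's two algorithms and its maximal neighborhood entropy strategy should be consulted for the actual procedure.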
Pages: 200-217
Number of pages: 18