MFSJMI: Multi-label feature selection considering join mutual information and interaction weight

Cited: 35
Authors
Zhang, Ping [1 ]
Liu, Guixia [2 ,3 ]
Song, Jiazhi [2 ,3 ]
Affiliations
[1] Hebei Univ Technol, Sch Artificial Intelligence, Tianjin 300401, Peoples R China
[2] Jilin Univ, Coll Comp Sci & Technol, Changchun 130012, Peoples R China
[3] Jilin Univ, Key Lab Symbol Computat & Knowledge Engn, Minist Educ, Changchun 130012, Peoples R China
Keywords
Multi-label learning; Multi-label feature selection; Information theory; Underlying assumptions; STREAMING FEATURE-SELECTION; CLASSIFICATION; RELIEFF;
DOI
10.1016/j.patcog.2023.109378
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multi-label feature selection extracts a reliable and informative feature subset from high-dimensional multi-label data, which plays an important role in pattern recognition. In conventional information-theoretic multi-label feature selection methods, the high-order relevance between a feature and the label set is evaluated using low-order mutual information. However, existing methods do not establish a theoretical basis for this low-order approximation. To fill this gap, we first identify two underlying assumptions about the high-order label distribution: the Label Independence Assumption (LIA) and the Paired-label Independence Assumption (PIA). Second, we systematically analyze the strengths and weaknesses of the two assumptions and introduce joint mutual information to fit more realistic label distributions. Furthermore, by decomposing joint mutual information, an interaction weight is proposed to account for multiple label correlations. Finally, a new method considering join mutual information and interaction weight is proposed. Comprehensive experiments demonstrate the effectiveness of the proposed method on various evaluation metrics. (c) 2023 Elsevier Ltd. All rights reserved.
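The abstract contrasts low-order mutual information (summing per-label terms, as under the Label Independence Assumption) with joint mutual information over paired labels. A minimal sketch of that distinction, using a hypothetical XOR-style toy dataset (not from the paper) where per-label terms vanish but the joint term does not:

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Discrete mutual information I(X; Y) in bits, estimated from samples."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Toy feature and two labels (illustrative, not from the paper):
# individually the feature is independent of each label, but it
# determines their XOR, so the paired-label joint mutual information
# exceeds the LIA-style sum of low-order terms.
f  = [0, 0, 1, 1]
y1 = [0, 1, 0, 1]
y2 = [0, 1, 1, 0]   # y2 = y1 XOR f

low_order = mutual_information(f, y1) + mutual_information(f, y2)
joint     = mutual_information(f, list(zip(y1, y2)))
print(low_order)  # 0.0 -- the low-order sum misses the interaction
print(joint)      # 1.0 -- joint I(f; y1, y2) captures it
```

This is only a didactic illustration of why a low-order approximation can underestimate feature relevance when labels interact; the paper's actual scoring criterion and interaction weight are defined in the full text.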
Pages: 12
Related Papers
50 records in total
  • [1] Multi-Label Feature Selection with Conditional Mutual Information
    Wang, Xiujuan
    Zhou, Yuchen
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022
  • [2] Approximating mutual information for multi-label feature selection
    Lee, J.
    Lim, H.
    Kim, D.-W.
    ELECTRONICS LETTERS, 2012, 48 (15) : 929 - 930
  • [3] Mutual Information-based multi-label feature selection using interaction information
    Lee, Jaesung
    Kim, Dae-Won
    EXPERT SYSTEMS WITH APPLICATIONS, 2015, 42 (04) : 2013 - 2025
  • [4] Multi-label feature selection based on neighborhood mutual information
    Lin, Yaojin
    Hu, Qinghua
    Liu, Jinghua
    Chen, Jinkun
    Duan, Jie
    APPLIED SOFT COMPUTING, 2016, 38 : 244 - 256
  • [5] Granular multi-label feature selection based on mutual information
    Li, Feng
    Miao, Duoqian
    Pedrycz, Witold
    PATTERN RECOGNITION, 2017, 67 : 410 - 423
  • [6] Feature-specific mutual information variation for multi-label feature selection
    Hu, Liang
    Gao, Lingbo
    Li, Yonghao
    Zhang, Ping
    Gao, Wanfu
    INFORMATION SCIENCES, 2022, 593 : 449 - 471
  • [7] Multi-label feature selection based on minimizing feature redundancy of mutual information
    Zhou, Gaozhi
    Li, Runxin
    Shang, Zhenhong
    Li, Xiaowu
    Jia, Lianyin
    NEUROCOMPUTING, 2024, 607
  • [8] Multi-label causal feature selection based on neighbourhood mutual information
    Wang, Jie
    Lin, Yaojin
    Li, Longzhu
    Wang, Yun-an
    Xu, Meiyan
    Chen, Jinkun
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2022, 13 (11) : 3509 - 3522
  • [9] Feature Selection for Multi-label Learning Using Mutual Information and GA
    Yu, Ying
    Wang, Yinglong
    ROUGH SETS AND KNOWLEDGE TECHNOLOGY, RSKT 2014, 2014, 8818 : 454 - 463