A unified low-order information-theoretic feature selection framework for multi-label learning

Cited by: 40
Authors
Gao, Wanfu [1 ,2 ]
Hao, Pingting [1 ,2 ]
Wu, Yang [1 ,2 ]
Zhang, Ping [1 ,3 ]
Affiliations
[1] Jilin Univ, Coll Comp Sci & Technol, Changchun 130012, Peoples R China
[2] Jilin Univ, Minist Educ, Key Lab Symbol Computat & Knowledge Engn, Changchun 130012, Peoples R China
[3] Hebei Univ Technol, Sch Artificial Intelligence, Tianjin 300480, Peoples R China
Keywords
Feature selection; Multi-label learning; Information theory; Low-order information-theoretic terms; Probability distribution assumption;
DOI
10.1016/j.patcog.2022.109111
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The approximation of low-order information-theoretic terms in feature selection approaches has achieved success in addressing high-dimensional multi-label data. However, three critical issues exist in this type of approach: (1) existing approaches are devised based on a single heuristic variable-correlation assumption, which biases them towards specific scenarios; (2) high-order variable correlations are ignored by the cumulative summation of low-order information-theoretic terms; (3) the abundance of approaches makes it difficult for researchers to devise and select an appropriate one. To address these issues, two types of probability distribution assumptions concerning candidate features and labels are derived based on low-order variable correlations. Afterwards, by sorting through all information-theoretic terms, we propose a unified feature selection framework for multi-label learning comprising three low-order information-theoretic terms, named Selected Terms of Feature Selection (STFS). STFS captures high-order variable correlations in the form of low-order information-theoretic terms. Furthermore, many previous multi-label feature selection approaches can be reduced to special forms of STFS. Finally, extensive experiments conducted on twelve benchmark data sets against seven state-of-the-art approaches demonstrate the classification superiority of STFS. (c) 2022 Elsevier Ltd. All rights reserved.
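To illustrate the class of methods the abstract discusses, the sketch below implements a generic greedy multi-label feature selector built from cumulative sums of low-order (pairwise) mutual-information terms. This is not the STFS criterion itself; the scoring function J(f) = Σ_j I(f; y_j) − β · mean_{s∈S} I(f; f_s) (relevance to each label minus redundancy with already-selected features), the β parameter, and all function names are illustrative assumptions in the spirit of cumulative low-order frameworks such as Brown et al. (2012), cited in this record.

```python
import numpy as np
from collections import Counter


def mutual_information(x, y):
    """Empirical mutual information I(X; Y) in nats for discrete 1-D arrays."""
    n = len(x)
    joint = Counter(zip(x, y))
    cx = Counter(x)
    cy = Counter(y)
    mi = 0.0
    for (a, b), c in joint.items():
        # p(x,y) * log( p(x,y) / (p(x) p(y)) ), written with raw counts
        mi += (c / n) * np.log(c * n / (cx[a] * cy[b]))
    return max(mi, 0.0)


def greedy_low_order_selection(X, Y, k, beta=1.0):
    """Forward selection with a generic cumulative low-order criterion.

    Score of candidate f: sum over labels of I(f; y_j), minus beta times the
    mean pairwise redundancy I(f; f_s) with already-selected features.
    X: (n_samples, n_features) discrete; Y: (n_samples, n_labels) discrete.
    """
    n_features = X.shape[1]
    # Relevance term: summed over all labels (a low-order approximation).
    relevance = np.array([
        sum(mutual_information(X[:, i], Y[:, j]) for j in range(Y.shape[1]))
        for i in range(n_features)
    ])
    selected, remaining = [], list(range(n_features))
    while len(selected) < k and remaining:
        best, best_score = None, -np.inf
        for i in remaining:
            redundancy = (
                np.mean([mutual_information(X[:, i], X[:, s]) for s in selected])
                if selected else 0.0
            )
            score = relevance[i] - beta * redundancy
            if score > best_score:
                best, best_score = i, score
        selected.append(best)
        remaining.remove(best)
    return selected
```

On a toy data set where feature 0 equals the label, the selector picks feature 0 first, since its relevance term dominates. The abstract's point (2) is visible here: only pairwise terms I(f; y_j) and I(f; f_s) are ever computed, so any correlation involving three or more variables jointly is invisible to this criterion.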
Pages: 14