Discriminative multi-label feature selection with adaptive graph diffusion

Cited by: 25
Authors
Ma, Jiajun [1 ]
Xu, Fei [1 ]
Rong, Xiaofeng [1 ]
Affiliations
[1] Xian Technol Univ, Sch Comp Sci & Engn, Xian 710021, Shaanxi, Peoples R China
Keywords
Multi-label learning; Feature selection; Adaptive graph diffusion; Sparse regularization;
DOI
10.1016/j.patcog.2023.110154
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Feature selection alleviates the curse of dimensionality by selecting more discriminative features, and thus plays an important role in multi-label learning. Recently, embedded feature selection methods have received increasing attention. However, most existing methods learn low-dimensional embeddings under the guidance of the local structure between original instance pairs, thereby ignoring the high-order structure between instances and remaining sensitive to noise in the original features. To address these issues, we propose a feature selection method named discriminative multi-label feature selection with adaptive graph diffusion (MFS-AGD). Specifically, we first construct a graph embedding learning framework equipped with adaptive graph diffusion to uncover a latent subspace that preserves the higher-order structural information between instances. Then, the Hilbert-Schmidt independence criterion (HSIC) is incorporated into the embedding learning framework to ensure maximum dependency between the latent representation and the labels. Benefiting from the interactive optimization of the feature selection matrix, the latent representation, and the similarity graph, the selected features can accurately capture the higher-order structural and supervised information of the data. By further considering the correlation between labels, MFS-AGD is extended to a more discriminative version, i.e., LMFS-AGD. Extensive experimental results on various benchmark data sets validate the advantages of the proposed MFS-AGD and LMFS-AGD methods.
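The two technical ingredients the abstract names can be sketched as follows: graph diffusion as a weighted sum of powers of a similarity matrix (so multi-hop, higher-order relations between instances enter the graph), and the empirical HSIC, tr(KHLH)/(n-1)^2, as a dependence measure between a latent representation and the labels. This is a minimal illustration, not the paper's implementation; the fixed diffusion weights and linear kernels are assumptions for demonstration only.

```python
import numpy as np

def diffuse(S, thetas):
    """Graph diffusion: a weighted sum of powers of the similarity matrix S.
    The t-th power encodes t-hop connectivity, so higher-order structure
    between instances enters the propagated graph. The weights `thetas`
    stand in for the paper's adaptively learned coefficients (assumed)."""
    A = np.zeros_like(S, dtype=float)
    P = np.eye(S.shape[0])
    for theta in thetas:
        P = P @ S          # next power of S: one more hop
        A += theta * P
    return A

def hsic(K, L):
    """Empirical HSIC between kernel matrices K and L (both n x n):
    tr(K H L H) / (n - 1)^2, where H = I - (1/n) 11^T centers the kernels.
    Larger values indicate stronger dependence between the two views."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Toy check: on a 3-node path graph, diffusion creates a 2-hop (0 -> 2) link
# absent from S, and HSIC of a representation with itself is positive.
S = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
A = diffuse(S, thetas=[0.5, 0.25])
V = np.random.default_rng(0).standard_normal((10, 4))
print(A[0, 2], hsic(V @ V.T, V @ V.T) > 0)
```

For PSD kernels K and L the value is nonnegative, since tr(KHLH) = tr((HKH)(HLH)) is a trace of a product of two PSD matrices.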
Pages: 14