Non-negative multi-label feature selection with dynamic graph constraints

Cited by: 51
Authors
Zhang, Yao [1 ]
Ma, Yingcang [1 ]
Affiliations
[1] Xian Polytech Univ, Sch Sci, Xian 710048, Shaanxi, Peoples R China
Keywords
Multi-label learning; Feature selection; Supervised learning; Manifold learning; Laplacian matrix; SUPERVISED LOGISTIC DISCRIMINATION;
DOI
10.1016/j.knosys.2021.107924
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Feature selection can combat the curse of dimensionality and improve the performance of classification algorithms, so multi-label feature selection is an essential part of multi-label learning and has attracted widespread attention. Many existing multi-label feature selection methods either ignore the correlation between labels or use logical labels directly to guide the feature selection process, which leads to a loss of label information. To address this issue, this paper proposes a non-negative multi-label feature selection method with dynamic graph constraints (NMDG). In the NMDG model, the original data space is projected into a low-dimensional manifold space by linear regression to construct a pseudo label matrix. By combining non-negative constraints with the label graph matrix, the pseudo label matrix preserves the topological structure of the original data. A robust low-dimensional representation of the pseudo label matrix is then used to construct a dynamic graph matrix, which is combined with the feature manifold to guide the learning of the feature weight matrix. Finally, we design an iterative algorithm based on alternating optimization to solve the proposed model and provide a convergence proof. Experimental comparisons with seven representative methods on ten real-world multi-label data sets demonstrate the effectiveness of the proposed method. (c) 2021 Elsevier B.V. All rights reserved.
Pages: 14
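The abstract describes the NMDG pipeline only at a high level (regression onto a non-negative pseudo label matrix, graph/manifold regularization, an l2,1-sparse feature weight matrix, and alternating optimization). The sketch below is a minimal, hypothetical illustration of that general family of graph-regularized multi-label feature selection methods, not the exact NMDG formulation: the objective terms, the closed-form alternating updates, the helper functions knn_laplacian and nmdg_like_selection, and all hyper-parameters (alpha, mu, beta, gamma, k) are assumptions made for illustration, and the dynamic graph update used by NMDG is omitted.

```python
# Illustrative sketch only -- NOT the NMDG algorithm from the paper.
# Objective (assumed): ||XW - F||^2 + alpha ||F - Y||^2 + mu tr(F' Ly F)
#                      + beta tr(W' Lx W) + gamma ||W||_{2,1},  with F >= 0.
import numpy as np

def knn_laplacian(M, k=5):
    """Unnormalized Laplacian of a symmetrized k-NN heat-kernel graph over the rows of M."""
    sq = np.sum(M ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * M @ M.T, 0.0)
    n = M.shape[0]
    S = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]                      # k nearest neighbours, skip self
        S[i, nbrs] = np.exp(-d2[i, nbrs] / (d2[i, nbrs].mean() + 1e-12))
    S = 0.5 * (S + S.T)                                        # symmetrize the similarity graph
    return np.diag(S.sum(axis=1)) - S

def nmdg_like_selection(X, Y, n_select, alpha=1.0, mu=0.5, beta=0.1, gamma=0.1, iters=30):
    """Alternating optimization of the assumed objective above.
    Features are ranked by the row norms of the weight matrix W."""
    n, d = X.shape
    Ly = knn_laplacian(Y)            # graph over samples built from the logical labels
    Lx = knn_laplacian(X.T)          # graph over features (feature manifold)
    F = np.maximum(Y.astype(float), 1e-6)
    # warm start for W with the l2,1 reweighting matrix set to the identity
    W = np.linalg.solve(X.T @ X + beta * Lx + gamma * np.eye(d), X.T @ F)
    for _ in range(iters):
        # W-step: IRLS-style reweighting handles the l2,1 norm
        D21 = np.diag(1.0 / (2.0 * np.linalg.norm(W, axis=1) + 1e-8))
        W = np.linalg.solve(X.T @ X + beta * Lx + gamma * D21, X.T @ F)
        # F-step: regression fit + closeness to Y + label-graph smoothness;
        # non-negativity enforced by a simple projection, not an exact constrained solve
        F = np.linalg.solve((1.0 + alpha) * np.eye(n) + mu * Ly, X @ W + alpha * Y)
        F = np.maximum(F, 0.0)
    scores = np.linalg.norm(W, axis=1)                         # feature importance = row norms of W
    return np.argsort(scores)[::-1][:n_select]

# toy usage: 60 samples, 20 features, 4 logical labels
rng = np.random.default_rng(0)
X = rng.standard_normal((60, 20))
Y = (rng.random((60, 4)) > 0.7).astype(float)
print(nmdg_like_selection(X, Y, n_select=5))
```

As in most embedded multi-label feature selection methods, feature importance is read off the row norms of W; the convergence analysis and the dynamic graph construction specific to NMDG are given in the paper itself.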