Local discriminative based sparse subspace learning for feature selection

Cited by: 62
Authors
Shang, Ronghua [1 ]
Meng, Yang [1 ]
Wang, Wenbing [1 ]
Shang, Fanhua [1 ]
Jiao, Licheng [1 ]
Affiliations
[1] Xidian Univ, Key Lab Intelligent Percept & Image Understanding, Joint Int Res Lab Intelligent Percept & Computat, Minist Educ, Sch Artificial Intelligence, Int Res C, Xian 710071, Shaanxi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Local discriminant model; Subspace learning; Sparse constraint; Feature selection; NONNEGATIVE MATRIX FACTORIZATION; UNSUPERVISED FEATURE-SELECTION;
DOI
10.1016/j.patcog.2019.03.026
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Subspace learning is a matrix-decomposition technique. Some algorithms apply subspace learning to feature selection, but they ignore the local discriminative information contained in the data. In this paper, we propose a new unsupervised feature selection algorithm to address this issue, called local discriminative based sparse subspace learning for feature selection (LDSSL). We first introduce a local discriminant model into our subspace-learning framework for feature selection. This model simultaneously preserves both the local discriminant structure and the local geometric structure of the data, which not only improves the discriminative ability of the algorithm but also exploits the local geometric information in the data. Because the local discriminant model is linear and cannot handle nonlinear data effectively, we kernelize it to obtain a nonlinear version. We then impose an L1-norm constraint on the feature selection matrix, which ensures its sparsity and further improves the algorithm's discriminative ability. We give the objective function, the convergence proof, and the iterative update rules of the algorithm. We compare LDSSL with eight state-of-the-art algorithms on six datasets; the experimental results show that LDSSL is more effective than the eight competing feature selection algorithms. (C) 2019 Elsevier Ltd. All rights reserved.
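The abstract describes the general recipe of sparse subspace learning for unsupervised feature selection: learn a subspace representation of the data, fit a sparse feature-selection matrix W against it, and rank features by the row norms of W. The sketch below is a minimal, generic illustration of that recipe, not the authors' LDSSL: it uses top principal components as the subspace target and ISTA (proximal gradient with soft-thresholding) for the L1-constrained fit, and omits the local discriminant model and the kernelization. All function and variable names here are illustrative.

```python
import numpy as np

def sparse_subspace_feature_scores(X, n_components=1, lam=0.1, n_iter=500):
    """Score features via sparse regression onto a learned subspace.

    Generic sketch (NOT the authors' LDSSL): the subspace target Y is the
    top principal-component coordinates of X, and the sparse matrix W is
    fit with ISTA on  (1/2) * ||Y - X W||_F^2 + lam * ||W||_1.
    """
    X = X - X.mean(axis=0)                      # center the data
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Y = U[:, :n_components] * s[:n_components]  # subspace coordinates of each sample
    d = X.shape[1]
    W = np.zeros((d, n_components))
    L = np.linalg.norm(X, 2) ** 2               # Lipschitz constant of the gradient
    step = 1.0 / L
    for _ in range(n_iter):
        G = X.T @ (X @ W - Y)                   # gradient of the quadratic term
        W = W - step * G                        # gradient step
        # proximal step: soft-thresholding enforces row-wise sparsity of W
        W = np.sign(W) * np.maximum(np.abs(W) - step * lam, 0.0)
    return np.linalg.norm(W, axis=1)            # row norms = feature importance

# Toy data: features 0 and 1 carry the latent structure, the rest are noise.
rng = np.random.default_rng(0)
z = rng.normal(size=200)
X = np.column_stack([z, -z] + [0.01 * rng.normal(size=200) for _ in range(8)])
scores = sparse_subspace_feature_scores(X)
top2 = set(np.argsort(scores)[-2:])
```

The L1 proximal step zeroes out rows of W belonging to uninformative features, which is the role the abstract assigns to the L1-norm constraint on the feature selection matrix; LDSSL additionally preserves local discriminant and geometric structure, which this sketch does not model.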
Pages: 219-230 (12 pages)