Discriminative sparse embedding based on adaptive graph for dimension reduction

Cited by: 36
Authors
Liu, Zhonghua [1 ]
Shi, Kaiming [1 ]
Zhang, Kaibing [2 ]
Ou, Weihua [3 ]
Wang, Lin [1 ]
Affiliations
[1] Henan Univ Sci & Technol, Informat Engn Coll, Luoyang, Peoples R China
[2] Xian Polytech Univ, Coll Elect & Informat, Xian, Peoples R China
[3] Guizhou Normal Univ, Sch Big Data & Comp Sci, Guiyang, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Manifold learning; Discriminative sparse embedding; Dimension reduction; Subspace learning; FACE-RECOGNITION; PRESERVING PROJECTIONS; REGRESSION; SELECTION; EIGENFACES;
DOI
10.1016/j.engappai.2020.103758
Chinese Library Classification (CLC) Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Traditional manifold learning methods usually use the original observed data directly to define the intrinsic structure among the data. Because the original samples often contain a great deal of redundant information or are corrupted by noise, the obtained intrinsic structure is unreliable. In addition, intrinsic structure learning and subspace learning are completely separated. To solve the above problems, this paper presents a novel dimension reduction method, termed discriminative sparse embedding (DSE), based on an adaptive graph. By projecting the original samples into a low-dimensional subspace, DSE learns a sparse weight matrix, which reduces the effects of the redundant information and noise in the original data and uncovers the essential structural relationships among the data. In DSE, a robust subspace is learned from the original data. Meanwhile, the intrinsic local structure and the optimal subspace are learned simultaneously and improved mutually, so that an accurate structure can be captured and an optimal subspace obtained. We propose an alternating, iterative method to solve the DSE model. To evaluate the performance of DSE, it is compared with several state-of-the-art feature extraction algorithms. Extensive experiments show that DSE is effective and feasible.
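The abstract describes an alternating, iterative optimization that couples sparse weight (graph) learning with subspace learning. The exact DSE objective is not given in this record, so the following is only a hypothetical sketch of such an alternating scheme: with the projection fixed, the weight matrix is updated by an ISTA-style proximal (soft-thresholding) step on a sparse self-representation problem in the subspace; with the weights fixed, the projection is refreshed from an eigen-decomposition. All function and variable names here are illustrative, not the authors' formulation.

```python
import numpy as np

def soft_threshold(A, tau):
    # Elementwise soft-thresholding: the proximal operator of the L1 norm,
    # the standard way to produce a sparse weight matrix.
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def alternating_dse_sketch(X, d, tau=0.1, n_iter=20):
    """Hypothetical alternating scheme (not the paper's exact model).
    X: (n_features, n_samples) data matrix; d: target dimension.
    Returns an orthonormal projection P and a sparse weight matrix W."""
    m, n = X.shape
    rng = np.random.default_rng(0)
    P = np.linalg.qr(rng.standard_normal((m, d)))[0]   # orthonormal init
    W = np.zeros((n, n))
    for _ in range(n_iter):
        Y = P.T @ X                        # project samples into the subspace
        # W-step: sparse self-representation Y ≈ Y W; one gradient step
        # followed by soft-thresholding (an ISTA-style proximal update).
        G = Y.T @ (Y @ W - Y)
        step = 1.0 / (np.linalg.norm(Y, 2) ** 2 + 1e-12)
        W = soft_threshold(W - step * G, tau * step)
        np.fill_diagonal(W, 0.0)           # forbid trivial self-representation
        # P-step: with W fixed, minimize ||P^T X - P^T X W||_F^2 over
        # orthonormal P, i.e. take eigenvectors of X M X^T with the
        # smallest eigenvalues.
        M = (np.eye(n) - W) @ (np.eye(n) - W).T
        S = X @ M @ X.T
        S = (S + S.T) / 2                  # symmetrize for numerical safety
        _, eigvecs = np.linalg.eigh(S)     # eigenvalues in ascending order
        P = eigvecs[:, :d]
    return P, W
```

The mutual improvement the abstract refers to shows up here as feedback between the two steps: a cleaner sparse graph W yields a better projection P, and projecting into that subspace in turn suppresses noise when W is re-estimated.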
Pages: 11