Constructing a Nonnegative Low-Rank and Sparse Graph With Data-Adaptive Features

Cited by: 95
Authors
Zhuang, Liansheng [1 ]
Gao, Shenghua [2 ]
Tang, Jinhui [3 ]
Wang, Jingjing [1 ]
Lin, Zhouchen [4 ,5 ]
Ma, Yi [2 ]
Yu, Nenghai [1 ]
Affiliations
[1] Univ Sci & Technol China, Sch Informat Sci & Technol, CAS Key Lab Electromagnet Space Informat, Hefei 230027, Peoples R China
[2] ShanghaiTech Univ, Shanghai 200031, Peoples R China
[3] Nanjing Univ Sci & Technol, Nanjing 210044, Jiangsu, Peoples R China
[4] Peking Univ, Sch Elect Engn & Comp Sci, Key Lab Machine Percept, Minist Educ, Beijing 100871, Peoples R China
[5] Cooperat Medianet Innovat Ctr, Shanghai 201805, Peoples R China
Funding
National Natural Science Foundation of China; US National Science Foundation;
Keywords
Graph Construction; low-rank and sparse representation; semi-supervised learning; data embedding; FACE RECOGNITION; REPRESENTATION; EIGENFACES; ALGORITHMS;
DOI
10.1109/TIP.2015.2441632
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper aims at constructing a good graph to discover the intrinsic data structures in a semi-supervised learning setting. First, we propose to build a nonnegative low-rank and sparse (referred to as NNLRS) graph for the given data representation. In particular, the weights of edges in the graph are obtained by seeking a nonnegative low-rank and sparse reconstruction coefficient matrix that represents each data sample as a linear combination of the others. The obtained NNLRS-graph captures both the global mixture-of-subspaces structure (through the low-rankness) and the locally linear structure (through the sparsity) of the data, and hence it is both generative and discriminative. Second, since good features are extremely important for constructing a good graph, we propose to learn the data embedding matrix and construct the graph simultaneously within one framework, which is termed NNLRS with embedded features (referred to as NNLRS-EF). Extensive experiments on three publicly available data sets demonstrate that the proposed method outperforms state-of-the-art graph construction methods by a large margin in both semi-supervised classification and discriminative analysis, which verifies the effectiveness of our proposed method.
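To make the graph construction described in the abstract concrete, the following is a minimal, illustrative Python sketch of a relaxed NNLRS-style objective, not the authors' actual solver. It minimizes 0.5*||X - XZ||_F^2 + alpha*||Z||_* + beta*||Z||_1 subject to Z >= 0 by proximal gradient descent, approximating the proximal operator of the combined regularizer by singular-value thresholding followed by nonnegative soft-thresholding. The function name nnlrs_graph and the parameters alpha, beta, and n_iter are assumptions made for illustration only.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau*||.||_* (nuclear norm)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (U * s) @ Vt

def nnlrs_graph(X, alpha=1.0, beta=0.5, n_iter=200):
    """Illustrative NNLRS-style graph: X is d x n (columns are samples); returns a symmetric affinity W."""
    n = X.shape[1]
    Z = np.zeros((n, n))
    # Step size from the Lipschitz constant of the gradient of 0.5*||X - XZ||_F^2
    L = np.linalg.norm(X, 2) ** 2 + 1e-12
    step = 1.0 / L
    for _ in range(n_iter):
        grad = X.T @ (X @ Z - X)              # gradient of the smooth fitting term
        Z = Z - step * grad
        Z = svt(Z, step * alpha)              # low-rank (nuclear norm) proximal step
        Z = np.maximum(Z - step * beta, 0.0)  # sparse (l1) step fused with the nonnegativity constraint
    np.fill_diagonal(Z, 0.0)                  # remove self-loops
    return 0.5 * (Z + Z.T)                    # symmetrize reconstruction coefficients into edge weights

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: two noisy 2-D subspaces embedded in 20-D, 30 samples each
    B1, B2 = rng.standard_normal((20, 2)), rng.standard_normal((20, 2))
    X = np.hstack([B1 @ rng.standard_normal((2, 30)), B2 @ rng.standard_normal((2, 30))])
    X += 0.01 * rng.standard_normal(X.shape)
    W = nnlrs_graph(X)
    print(W.shape, W.min(), W.max())
```

Note that the sequential application of the two proximal steps is only an approximation of the exact proximal operator of the combined regularizer; the paper itself formulates the problem with an explicit noise term and an exact constrained solver.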
Pages: 3717-3728
Page count: 12