Kernel Sparse Representation-Based Classifier

Cited by: 268
Authors
Zhang, Li [1 ]
Zhou, Wei-Da [2 ]
Chang, Pei-Chann [3 ]
Liu, Jing [4 ]
Yan, Zhe [4 ]
Wang, Ting [4 ]
Li, Fan-Zhang [1 ]
Affiliations
[1] Soochow Univ, Res Ctr Machine Learning & Data Anal, Sch Comp Sci & Technol, Suzhou 215006, Jiangsu, Peoples R China
[2] AI Speech Ltd, Suzhou 215123, Jiangsu, Peoples R China
[3] Yuan Ze Univ, Dept Informat Management, Tao Yuan 32026, Taiwan
[4] Xidian Univ, Key Lab Intelligent Percept & Image Understanding, Minist Educ, Xian 710071, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
ℓ2-norm; compressed sensing; kernel method; machine learning; sparse representation; SUPPORT VECTOR MACHINES; FACE RECOGNITION; PROJECTION; SELECTION; TUTORIAL;
DOI
10.1109/TSP.2011.2179539
CLC classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline codes
0808; 0809;
Abstract
The sparse representation-based classifier (SRC), a combination of machine learning and compressed sensing, shows good classification performance on face image data. However, SRC cannot properly classify data with the same direction distribution, i.e., data whose sample vectors from different classes lie along the same vector direction. This paper presents a new classifier, the kernel sparse representation-based classifier (KSRC), built on SRC and the kernel trick, a common technique in machine learning. KSRC is a nonlinear extension of SRC and remedies this drawback. To make the data in an input space separable, we implicitly map them into a high-dimensional kernel feature space using a nonlinear mapping associated with a kernel function. Since this kernel feature space has a very high (possibly infinite) dimensionality, or is unknown, we must avoid working in it explicitly. Fortunately, its dimensionality can be reduced by kernel-based dimensionality reduction methods. In the reduced subspace, we find sparse combination coefficients for a test sample and assign it a class label. Like SRC, KSRC is cast as an ℓ1-minimization problem or a quadratically constrained ℓ1-minimization problem. Extensive experiments on UCI and face data sets show that KSRC improves on the performance of SRC.
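The pipeline the abstract describes (kernel mapping, kernel-based dimensionality reduction, sparse coding over the training dictionary, minimum-residual class assignment) can be sketched roughly as follows. This is an illustrative approximation, not the authors' implementation: the use of scikit-learn's RBF `KernelPCA` for the reduction step and an ℓ1-regularized least-squares (`Lasso`) solver in place of the exact ℓ1-minimization, along with all parameter values, are assumptions.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import Lasso

def ksrc_predict(X_train, y_train, X_test, gamma=1.0, n_components=10, alpha=1e-3):
    """KSRC sketch: map data into a reduced kernel feature space, sparse-code
    each test sample over the training dictionary, and pick the class whose
    atoms give the smallest reconstruction residual."""
    kpca = KernelPCA(n_components=n_components, kernel="rbf", gamma=gamma)
    A = kpca.fit_transform(X_train)            # (n_train, d) reduced training samples
    B = kpca.transform(X_test)                 # (n_test, d) reduced test samples
    # Normalize dictionary atoms to unit l2-norm, as in SRC.
    A = A / (np.linalg.norm(A, axis=1, keepdims=True) + 1e-12)
    classes = np.unique(y_train)
    preds = []
    for b in B:
        b = b / (np.linalg.norm(b) + 1e-12)
        # l1-regularized least squares stands in for the l1-minimization step.
        lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
        lasso.fit(A.T, b)                      # dictionary columns = training samples
        x = lasso.coef_
        # Class-wise residuals: keep only the coefficients of one class at a time.
        residuals = [np.linalg.norm(b - A.T @ np.where(y_train == c, x, 0.0))
                     for c in classes]
        preds.append(classes[int(np.argmin(residuals))])
    return np.array(preds)

# Toy "same direction distribution": both classes lie on the ray through (1, 1),
# differing only in magnitude -- the case where plain SRC breaks down.
rng = np.random.default_rng(0)
u = np.array([1.0, 1.0]) / np.sqrt(2.0)
X = np.vstack([np.outer(rng.uniform(0.5, 1.0, 20), u),
               np.outer(rng.uniform(2.0, 3.0, 20), u)])
y = np.array([0] * 20 + [1] * 20)
accuracy = (ksrc_predict(X, y, X, gamma=2.0, n_components=5) == y).mean()
```

The RBF kernel turns the magnitude difference between the two collinear classes into a separable structure in the feature space, which is exactly the nonlinearity that linear SRC lacks.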
Pages: 1684-1695
Page count: 12