Kernel Sparse Representation-Based Classifier

Cited by: 268
Authors
Zhang, Li [1 ]
Zhou, Wei-Da [2 ]
Chang, Pei-Chann [3 ]
Liu, Jing [4 ]
Yan, Zhe [4 ]
Wang, Ting [4 ]
Li, Fan-Zhang [1 ]
Affiliations
[1] Soochow Univ, Res Ctr Machine Learning & Data Anal, Sch Comp Sci & Technol, Suzhou 215006, Jiangsu, Peoples R China
[2] AI Speech Ltd, Suzhou 215123, Jiangsu, Peoples R China
[3] Yuan Ze Univ, Dept Informat Management, Tao Yuan 32026, Taiwan
[4] Xidian Univ, Key Lab Intelligent Percept & Image Understanding, Minist Educ, Xian 710071, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
l(2)-norm; compressed sensing; kernel method; machine learning; sparse representation; SUPPORT VECTOR MACHINES; FACE RECOGNITION; PROJECTION; SELECTION; TUTORIAL;
DOI
10.1109/TSP.2011.2179539
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Classification Code
0808; 0809;
Abstract
Sparse representation-based classifier (SRC), a combined result of machine learning and compressed sensing, shows good classification performance on face image data. However, SRC cannot effectively classify data with the same direction distribution, i.e., data whose sample vectors from different classes lie along the same vector direction. This paper presents a new classifier, the kernel sparse representation-based classifier (KSRC), based on SRC and the kernel trick, a standard technique in machine learning. KSRC is a nonlinear extension of SRC and remedies this drawback of SRC. To make the data in an input space separable, we implicitly map the data into a high-dimensional kernel feature space by using a nonlinear mapping associated with a kernel function. Since this kernel feature space has a very high (possibly infinite) dimensionality, or is unknown, we must avoid working in it explicitly. Fortunately, the dimensionality of the kernel feature space can be reduced by exploiting kernel-based dimensionality reduction methods. In the reduced subspace, sparse combination coefficients are found for a test sample, which is then assigned a class label. Like SRC, KSRC is cast as an l(1)-minimization problem or a quadratically constrained l(1)-minimization problem. Extensive experimental results on UCI and face data sets show that KSRC improves on the performance of SRC.
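The pipeline the abstract describes can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it uses the empirical kernel map (kernel values against the training set) as the reduced representation, a basic ISTA solver for the unconstrained l1-regularized problem in place of the paper's l(1)-minimization formulations, and a toy "same direction" data set where both classes lie along the direction (1, 1); all names, parameters (`gamma`, `lam`), and data are illustrative assumptions.

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    """RBF kernel matrix between rows of X and rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def ista(A, b, lam, n_iter=500):
    """ISTA for min_x 0.5*||A x - b||^2 + lam*||x||_1 (a stand-in l1 solver)."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - b) / L        # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

def ksrc_predict(X_train, y_train, X_test, gamma=1.0, lam=1e-3):
    """Classify each test sample by class-wise residual of its sparse code
    computed in the (empirical) kernel feature space."""
    K = rbf(X_train, X_train, gamma)  # dictionary: kernel map of training data
    preds = []
    for x in X_test:
        k = rbf(X_train, x[None, :], gamma).ravel()  # kernel map of test sample
        alpha = ista(K, k, lam)
        # keep only the coefficients of one class at a time; pick the class
        # whose partial reconstruction leaves the smallest residual (as in SRC)
        best, best_r = None, np.inf
        for c in np.unique(y_train):
            a = np.where(y_train == c, alpha, 0.0)
            r = np.linalg.norm(k - K @ a)
            if r < best_r:
                best, best_r = c, r
        preds.append(best)
    return np.array(preds)

# Toy "same direction" data: both classes lie along (1, 1) and differ only
# in magnitude -- the distribution the abstract says plain SRC handles poorly.
X_train = np.array([[1.0, 1.0], [1.1, 0.9], [0.9, 1.1],
                    [3.0, 3.0], [3.1, 2.9], [2.9, 3.1]])
y_train = np.array([0, 0, 0, 1, 1, 1])
preds = ksrc_predict(X_train, y_train, np.array([[1.05, 1.0], [2.95, 3.05]]))
```

Because the RBF kernel depends on distances rather than directions, the two classes become separable in the kernel feature space even though every sample points the same way in the input space.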
Pages: 1684-1695
Number of pages: 12