Kernel Reverse Neighborhood Discriminant Analysis

Cited by: 1
Authors
Li, Wangwang [1 ]
Tan, Hengliang [1 ]
Feng, Jianwei [1 ]
Xie, Ming [1 ]
Du, Jiao [1 ]
Yang, Shuo [1 ]
Yan, Guofeng [1 ]
Affiliations
[1] Guangzhou Univ, Sch Comp Sci & Cyber Engn, Guangzhou 510006, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
linear discriminant analysis; kernel trick; reverse nearest neighbors; Gaussian kernel; LDA; EXTRACTION;
DOI
10.3390/electronics12061322
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline classification code
0812 ;
Abstract
Currently, neighborhood linear discriminant analysis (nLDA) exploits reverse nearest neighbors (RNN) to avoid the assumption of linear discriminant analysis (LDA) that all samples from the same class should be independently and identically distributed (i.i.d.). nLDA performs well when a dataset contains multimodal classes. However, in complex pattern recognition tasks, such as visual classification, the complex appearance variations caused by deformation, illumination, and viewing angle often introduce non-linearity. Furthermore, it is not easy to separate multimodal classes in a lower-dimensional feature space. One solution to these problems is to map the features to a higher-dimensional feature space for discriminant learning. Hence, in this paper, we employ kernel functions to map the original data to a higher-dimensional feature space, where the nonlinear multimodal classes can be better classified. We give a detailed derivation of the proposed kernel reverse neighborhood discriminant analysis (KRNDA) using the kernel trick. The proposed KRNDA outperforms the original nLDA on most datasets of the UCI benchmark database. In high-dimensional visual recognition tasks of handwritten digit recognition, object categorization and face recognition, our KRNDA achieves the best recognition results compared to several sophisticated LDA-based discriminators.
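The kernel mapping described in the abstract can be made concrete with a small sketch: the Gaussian (RBF) kernel listed in the keywords evaluates inner products in an implicit higher-dimensional feature space without ever constructing that space, which is the "kernel trick" KRNDA builds on. The function name, bandwidth parameter, and toy data below are illustrative only and are not taken from the paper.

```python
import numpy as np

def gaussian_kernel_matrix(X, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 * sigma^2)).

    Each entry is the inner product <phi(x_i), phi(x_j)> in the implicit
    feature space induced by the Gaussian kernel, so kernelized methods
    (kernel LDA, and by extension KRNDA) can work with K alone.
    """
    sq = np.sum(X ** 2, axis=1)
    # Squared pairwise distances via the expansion ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# Three toy 2-D samples (hypothetical data for illustration)
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
K = gaussian_kernel_matrix(X, sigma=1.0)
# K is symmetric positive semi-definite with a unit diagonal
```

A kernelized discriminant method replaces every dot product in the linear formulation with an entry of `K`, which is why the derivation in the paper can stay entirely in terms of kernel evaluations.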
Pages: 17