Efficient kernel discriminant analysis via spectral regression

Cited by: 67
Authors
Cai, Deng
He, Xiaofei
Han, Jiawei
Source
ICDM 2007: PROCEEDINGS OF THE SEVENTH IEEE INTERNATIONAL CONFERENCE ON DATA MINING | 2007
DOI
10.1109/ICDM.2007.88
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Linear Discriminant Analysis (LDA) is a popular method for extracting features that preserve class separability. The projection vectors are commonly obtained by maximizing the between-class covariance while simultaneously minimizing the within-class covariance. LDA can be performed either in the original input space or in the reproducing kernel Hilbert space (RKHS) into which the data points are mapped, which leads to Kernel Discriminant Analysis (KDA). When the data are highly nonlinearly distributed, KDA can achieve better performance than LDA. However, computing the projective functions in KDA involves an eigen-decomposition of the kernel matrix, which is very expensive when a large number of training samples exist. In this paper, we present a new algorithm for kernel discriminant analysis, called Spectral Regression Kernel Discriminant Analysis (SRKDA). By using spectral graph analysis, SRKDA casts discriminant analysis into a regression framework, which facilitates both efficient computation and the use of regularization techniques. Specifically, SRKDA only needs to solve a set of regularized regression problems, with no eigenvector computation involved, which yields a huge saving in computational cost. Our computational analysis shows that SRKDA is 27 times faster than ordinary KDA. Moreover, the new formulation makes it easy to develop an incremental version of the algorithm that can fully utilize the computational results on the existing training samples. Experiments on face recognition demonstrate the effectiveness and efficiency of the proposed algorithm.
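The regression formulation sketched in the abstract can be illustrated in a few lines. This is a hedged, minimal sketch (the function names, the RBF kernel choice, and the regularization parameter `delta` are illustrative assumptions, not taken from the paper): centered class-indicator vectors stand in for the graph-embedding responses, and each discriminant direction is obtained by solving one regularized linear system (K + δI)α = ȳ instead of eigen-decomposing the kernel matrix K.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise squared Euclidean distances, mapped through an RBF kernel.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def srkda_fit(X, y, delta=0.01, gamma=1.0):
    # Sketch of the SRKDA idea: responses come from class indicators,
    # then each projective function solves (K + delta*I) alpha = y_k.
    n = len(y)
    classes = np.unique(y)
    K = rbf_kernel(X, X, gamma)
    # Centered class-indicator vectors span the same response space as
    # the leading eigenvectors of the class-block affinity graph.
    Y = np.stack([(y == c).astype(float) for c in classes], axis=1)
    Y -= Y.mean(axis=0)  # remove the uninformative constant vector
    # One regularized linear system per direction; no eigen-decomposition.
    A = np.linalg.solve(K + delta * np.eye(n), Y)
    return A, X, gamma

def srkda_transform(X_new, model):
    # Project new points: f(x) = sum_i alpha_i k(x_i, x).
    A, X_train, gamma = model
    return rbf_kernel(X_new, X_train, gamma) @ A
```

For example, fitting two well-separated Gaussian clusters and projecting the training points should place the two classes at clearly different coordinates along the first discriminant direction.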
Pages: 427-432
Page count: 6
References
14 items in total
  • [1] [Anonymous], 1999, NEURAL NETWORKS SIGN
  • [2] Generalized discriminant analysis using a kernel approach
    Baudat, G
    Anouar, FE
    [J]. NEURAL COMPUTATION, 2000, 12 (10) : 2385 - 2404
  • [3] CAI D, 2007, UIUCDCS-R-2007-2857
  • [4] CAI D, 2007, UIUCDCS-R-2007-2888
  • [5] LIBSVM: A Library for Support Vector Machines
    Chang, Chih-Chung
    Lin, Chih-Jen
    [J]. ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2011, 2 (03)
  • [6] Fukunaga K., 1990, INTRO STAT PATTERN R
  • [7] Golub G. H., 2012, Matrix Computations, V4th
  • [8] Micchelli Charles A., 1986, S APPL MATH NEW YORK, V36, P81
  • [9] MIKA S, 2001, P AISTATS 2001
  • [10] Scholkopf B., 2001, Learning with kernels: support vector machines, regularization, optimization, and beyond