A comparison of generalized linear discriminant analysis algorithms

Cited by: 111
Authors
Park, Cheong Hee
Park, Haesun
Affiliations
[1] Chungnam Natl Univ, Dept Comp Sci & Engn, Taejon 305763, South Korea
[2] Coll Comp, Georgia Inst Technol, Atlanta, GA 30332 USA
Keywords
dimension reduction; feature extraction; generalized linear discriminant analysis; kernel methods; nonlinear discriminant analysis; undersampled problems;
DOI
10.1016/j.patcog.2007.07.022
CLC classification number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
Linear discriminant analysis (LDA) is a dimension reduction method which finds an optimal linear transformation that maximizes the class separability. However, in undersampled problems where the number of data samples is smaller than the dimension of data space, it is difficult to apply LDA due to the singularity of scatter matrices caused by high dimensionality. In order to make LDA applicable, several generalizations of LDA have been proposed recently. In this paper, we present theoretical and algorithmic relationships among several generalized LDA algorithms and compare their computational complexities and performances in text classification and face recognition. Towards a practical dimension reduction method for high dimensional data, an efficient algorithm is proposed, which reduces the computational complexity greatly while achieving competitive prediction accuracies. We also present nonlinear extensions of these LDA algorithms based on kernel methods. It is shown that a generalized eigenvalue problem can be formulated in the kernel-based feature space, and generalized LDA algorithms are applied to solve the generalized eigenvalue problem, resulting in nonlinear discriminant analysis. Performances of these linear and nonlinear discriminant analysis algorithms are compared extensively. (C) 2007 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
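The singularity issue the abstract describes can be seen directly: when the number of samples n is smaller than the dimension d, the within-class scatter matrix has rank at most n minus the number of classes, so classical LDA (which inverts it) breaks down. The sketch below, with made-up data and an illustrative ridge term (in the spirit of regularized discriminant analysis), is not the paper's proposed algorithm, only a minimal demonstration of the problem and one common workaround.

```python
import numpy as np

rng = np.random.default_rng(0)

# Undersampled setting: n = 10 samples in d = 50 dimensions, two classes.
n, d = 10, 50
X = rng.normal(size=(n, d))
y = np.array([0] * 5 + [1] * 5)

# Within-class scatter S_w: sum over classes of centered outer products.
Sw = np.zeros((d, d))
for c in np.unique(y):
    Xc = X[y == c] - X[y == c].mean(axis=0)
    Sw += Xc.T @ Xc

# rank(S_w) <= n - (number of classes) = 8 < d = 50,
# so S_w is singular and cannot be inverted as classical LDA requires.
print(np.linalg.matrix_rank(Sw))

# One simple workaround (regularized LDA): add a small ridge so the
# inverse exists, then compute the two-class Fisher direction.
Sw_reg = Sw + 1e-3 * np.eye(d)
mu0, mu1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
w = np.linalg.solve(Sw_reg, mu1 - mu0)  # projection direction
```

The generalized LDA algorithms compared in the paper avoid the singularity differently (e.g. via the generalized eigenvalue formulation mentioned in the abstract) rather than by ridge regularization.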
Pages: 1083-1097
Page count: 15