A pre-clustering technique for optimizing subclass discriminant analysis

Cited by: 14
Author
Kim, Sang-Woon [1 ]
Affiliation
[1] Myongji Univ, Dept Comp Sci & Engn, Yongin 449728, South Korea
Keywords
Dimensionality reduction; Subclass discriminant analysis (SDA); Linear discriminant analysis (LDA); Pre-clustering technique; LINEAR DIMENSIONALITY REDUCTION; FACE RECOGNITION; LDA;
DOI
10.1016/j.patrec.2009.07.007
CLC Classification
TP18 [Theory of artificial intelligence];
Subject Classification
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Subclass discriminant analysis (SDA) [Zhu, M., Martinez, A.M., 2006. Subclass discriminant analysis. IEEE Trans. Pattern Anal. Machine Intell., 28(8), pp. 1274-1286] is a dimensionality reduction method that has proven successful for different types of class distributions. In SDA, the reduction of dimensionality is not achieved by assuming that each class is represented by a single cluster, but rather by approximating the underlying distribution with a mixture of Gaussians. The advantage of SDA is that, since it does not treat the class-conditional distributions as uni-modal, nonlinearly separable problems can be handled as linear ones. The problem with this strategy, however, is that to estimate the number of subclasses needed to represent the distribution of each class, i.e., to find the best partition, all possible solutions must be verified. This approach therefore incurs a high computational cost. In this paper, we propose a method that reduces the computational burden of SDA-based classification by simply reducing the number of classes to be examined: a few classes of the training set are chosen prior to the execution of SDA. To select the classes to be partitioned, the intra-set distance is employed as a criterion, and a k-means clustering is performed to divide them. Our experimental results for an artificial data set of XOR-type samples and three benchmark image databases (Kimia, AT&T, and Yale) demonstrate that the processing CPU-time of SDA optimized with the proposed scheme can be reduced dramatically without either sacrificing classification accuracy or increasing computational complexity. (C) 2009 Elsevier B.V. All rights reserved.
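The pre-clustering step described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names, the threshold parameter, the choice of k, and the particular form of the intra-set distance (mean squared distance to the class mean) are all assumptions introduced here for illustration; the paper's exact criterion and clustering setup may differ.

```python
import numpy as np

def intra_set_distance(X):
    """Mean squared distance of samples to their class mean
    (one common form of the intra-set distance; assumed here)."""
    mu = X.mean(axis=0)
    return float(np.mean(np.sum((X - mu) ** 2, axis=1)))

def kmeans(X, k, n_iter=50, seed=0):
    """Plain Lloyd's k-means (NumPy only), returning subclass labels."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # Assign each sample to its nearest center.
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d.argmin(axis=1)
        # Recompute centers from the current assignment.
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    return labels

def pre_cluster(X, y, threshold, k=2):
    """Split only the classes whose intra-set distance exceeds
    `threshold` into k subclasses via k-means; compact classes
    keep a single subclass. Returns per-sample subclass labels."""
    sub = np.zeros(len(y), dtype=int)
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        if intra_set_distance(X[idx]) > threshold:
            sub[idx] = kmeans(X[idx], k)
    return sub
```

After this step, the (class, subclass) pairs define the partitions handed to the discriminant analysis, so only classes with large within-class scatter pay the cost of subclass search.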
Pages: 462-468
Page count: 7