Two-stage optimal component analysis

Cited by: 1
Authors
Wu, Yiming [1 ]
Liu, Xiuwen [1 ]
Mio, Washington [2 ]
Gallivan, K. A. [3 ]
Affiliations
[1] Florida State Univ, Dept Comp Sci, Tallahassee, FL 32306 USA
[2] Florida State Univ, Dept Math, Tallahassee, FL 32306 USA
[3] Florida State Univ, Sch Comp Sci, Tallahassee, FL 32306 USA
Source
2006 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP 2006, PROCEEDINGS | 2006
Keywords
machine vision; face recognition; image analysis; optimal method; stochastic process
DOI
10.1109/ICIP.2006.312858
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Linear representations are widely used to reduce dimension in applications involving high-dimensional data. While specialized procedures exist for certain optimality criteria, such as principal component analysis (PCA) and Fisher discriminant analysis (FDA), they do not generalize to arbitrary criteria. To overcome this fundamental limitation, optimal component analysis (OCA) uses a stochastic gradient optimization procedure that is intrinsic to the manifold defined by the constraints of the application, and therefore yields a procedure for finding optimal representations under general criteria. However, because of its generality, OCA often requires extensive computation for gradient estimation and updating. To reduce this computation significantly, we propose a two-stage method: first reduce the input to an intermediate dimension (smaller than the original but larger than the final one) using a computationally efficient method, and then perform OCA in the reduced space. This cuts the computation time from days to minutes on widely used databases, making OCA learning feasible for many applications. Additionally, since the reduced space is much smaller, the stochastic gradient optimization tends to be more efficient. We illustrate the effectiveness of the proposed method on face classification.
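The abstract describes the two-stage pipeline only in prose. Below is a minimal Python sketch of the idea, assuming PCA as the efficient first-stage reducer and substituting a simple stochastic search over orthonormal bases (with QR retraction) for OCA's intrinsic stochastic gradient procedure; the function names and the toy nearest-neighbor criterion are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def pca_reduce(X, k):
    """Stage 1: project n x d data X onto its top-k principal components."""
    Xc = X - X.mean(axis=0)
    # Rows of Vt are principal directions (right singular vectors).
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k].T  # reduced data (n x k), basis A (d x k)

def nn_accuracy(Z, y):
    """Toy performance criterion F: leave-one-out 1-nearest-neighbor accuracy."""
    D = np.linalg.norm(Z[:, None] - Z[None, :], axis=2)
    np.fill_diagonal(D, np.inf)          # exclude each point from its own query
    return np.mean(y[D.argmin(axis=1)] == y)

def oca_search(Z, y, m, iters=200, step=0.1, seed=0):
    """Stage 2 (stand-in for OCA): stochastic search for a k x m orthonormal
    basis U maximizing F, re-orthonormalized via QR after each perturbation."""
    rng = np.random.default_rng(seed)
    k = Z.shape[1]
    U, _ = np.linalg.qr(rng.standard_normal((k, m)))
    best = nn_accuracy(Z @ U, y)
    for _ in range(iters):
        # Random perturbation, then retraction back onto the orthonormal set.
        cand, _ = np.linalg.qr(U + step * rng.standard_normal((k, m)))
        f = nn_accuracy(Z @ cand, y)
        if f >= best:
            U, best = cand, f
    return U, best

# Usage on synthetic two-class data: reduce 100-D inputs to k = 20 by PCA,
# then search for an m = 5 subspace of the 20-D reduced space.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (40, 100)), rng.normal(0.5, 1.0, (40, 100))])
y = np.array([0] * 40 + [1] * 40)
Z, A = pca_reduce(X, 20)        # stage 1: 100 -> 20 (cheap)
U, acc = oca_search(Z, y, 5)    # stage 2: 20 -> 5 (search in the small space)
print("criterion value:", acc)  # composite projection is A @ U (100 x 5)
```

The point of the two stages is visible in the shapes: the expensive iterative search runs over k x m matrices instead of d x m ones, while the composite map A @ U still projects the original d-dimensional inputs to the final m-dimensional space.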
Pages: 2041+
Number of pages: 2
Related References
Total: 9
[1] Belhumeur, P. N., Hespanha, J. P., Kriegman, D. J. Eigenfaces vs. Fisherfaces: Recognition using class specific linear projection. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1997, 19(7): 711-720.
[2] Duda, R. O., et al. Pattern Classification. Wiley, 2000.
[3] Hyvarinen, A., et al. Independent Component Analysis. Wiley, 2001.
[4] Liu, X. W., Srivastava, A., Gallivan, K. Optimal linear representations of images for object recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2004, 26(5): 662-666.
[5] Liu, X. W., Srivastava, A., Wang, D. L. Intrinsic generalization analysis of low dimensional representations. Neural Networks, 2003, 16(5-6): 537-545.
[6] Srivastava, A., Liu, X. W. Tools for application-driven linear dimension reduction. Neurocomputing, 2005, 67: 136-160.
[7] Turk, M., Pentland, A. Eigenfaces for recognition. Journal of Cognitive Neuroscience, 1991, 3(1): 71-86.
[8] Vempala, S. S. The Random Projection Method. DIMACS Series in Discrete Mathematics and Theoretical Computer Science, Vol. 65. 2004.
[9] Ye, J. P., Li, Q. A two-stage linear discriminant analysis via QR-decomposition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, 27(6): 929-941.