First-order approximation of Gram-Schmidt orthonormalization beats deflation in coupled PCA learning rules

Cited by: 4
Author
Moeller, Ralf [1]
Affiliation
[1] Univ Bielefeld, Fac Technol, Comp Engn Grp, D-33954 Bielefeld, Germany
Keywords
principal component analysis; coupled learning rules; orthonormalization; Gram-Schmidt method; deflation;
DOI
10.1016/j.neucom.2005.06.016
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
In coupled learning rules for principal component analysis, eigenvectors and eigenvalues are simultaneously estimated in a coupled system of equations. Coupled single-neuron rules have favorable convergence properties. For the estimation of multiple eigenvectors, orthonormalization methods have to be applied: either full Gram-Schmidt orthonormalization, its first-order approximation as used in Oja's stochastic gradient ascent algorithm, or deflation as in Sanger's generalized Hebbian algorithm. This paper reports the observation that a first-order approximation of Gram-Schmidt orthonormalization is superior to the standard deflation procedure in coupled learning rules. The first-order approximation exhibits a smaller orthonormality error and produces eigenvectors and eigenvalues of better quality. This improvement is essential for applications where multiple principal eigenvectors have to be estimated simultaneously rather than sequentially. Moreover, loss of orthonormality may have a harmful effect on subsequent processing stages, such as the computation of distance measures for competition in local PCA methods. (c) 2005 Elsevier B.V. All rights reserved.
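To make the contrast between the orthonormalization variants concrete, the following is a minimal numerical sketch (not the paper's coupled learning rule itself, and all function names are illustrative): it applies full Gram-Schmidt and its one-shot first-order approximation, w_j ← w_j − Σ_{i<j} (w_i·w_j) w_i − ½(w_j·w_j − 1) w_j, to a slightly non-orthonormal weight matrix, as produced by one stochastic learning step, and compares the residual orthonormality error ‖WᵀW − I‖.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 10, 3  # input dimension, number of eigenvector estimates

# Orthonormal basis perturbed by a small "learning step" (illustrative setup).
Q = np.linalg.qr(rng.standard_normal((n, m)))[0]
W = Q + 0.01 * rng.standard_normal((n, m))

def ortho_error(W):
    """Orthonormality error ||W^T W - I|| (Frobenius norm)."""
    return np.linalg.norm(W.T @ W - np.eye(W.shape[1]))

def full_gram_schmidt(W):
    """Exact sequential Gram-Schmidt: each column is projected against the
    already-orthonormalized previous columns and renormalized."""
    W = W.copy()
    for j in range(W.shape[1]):
        for i in range(j):
            W[:, j] -= (W[:, i] @ W[:, j]) * W[:, i]
        W[:, j] /= np.linalg.norm(W[:, j])
    return W

def first_order_gram_schmidt(W):
    """One-shot first-order approximation computed from the current
    (pre-update) vectors: subtract projections onto the previous columns
    and apply a first-order normalization."""
    G = W.T @ W
    # Column j of W @ triu(G, 1) is sum_{i<j} (w_i . w_j) w_i.
    return W - W @ np.triu(G, 1) - 0.5 * W @ np.diag(np.diag(G) - 1.0)

print("before:      ", ortho_error(W))
print("first-order: ", ortho_error(first_order_gram_schmidt(W)))  # much smaller
print("full GS:     ", ortho_error(full_gram_schmidt(W)))         # ~ machine precision
```

The sketch shows why the first-order step is attractive in an online rule: it needs only one pass over the current Gram matrix (no sequential dependency between columns), yet after a small perturbation it leaves an orthonormality error that is second order in the perturbation size.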
Pages: 1582-1590 (9 pages)
Cited references (16 total)
[1] Bannour S, Azimi-Sadjadi MR. Principal component extraction using recursive least-squares learning. IEEE Transactions on Neural Networks, 1995, 6(2): 457-469.
[2] Chen LH, Chang SY. An adaptive learning algorithm for principal component analysis. IEEE Transactions on Neural Networks, 1995, 6(5): 1255-1263.
[3] Diamantaras KI. Principal Component Neural Networks: Theory and Applications. 1996.
[4] Hornik K, Kuan CM. Convergence analysis of local feature-extraction algorithms. Neural Networks, 1992, 5(2): 229-240.
[5] Miao YF. IEEE Transactions on Neural Networks, 1996, 7: 1052. DOI 10.1109/72.508950.
[6] Möller R, Hoffmann H. An extension of neural gas to local PCA. Neurocomputing, 2004, 62: 305-326.
[7] Möller R, Könies A. Coupled principal component analysis. IEEE Transactions on Neural Networks, 2004, 15(1): 214-222.
[8] Möller R. Interlocking of learning and orthonormalization in RRLSA. Neurocomputing, 2002, 49: 429-433.
[9] Oja E. Principal components, minor components, and linear neural networks. Neural Networks, 1992, 5(6): 927-935.
[10] Oja E, Karhunen J. On stochastic approximation of the eigenvectors and eigenvalues of the expectation of a random matrix. Journal of Mathematical Analysis and Applications, 1985, 106(1): 69-84.