Asymptotic distributions associated to Oja's learning equation for neural networks

Cited by: 3
Authors
Delmas, JP [1 ]
Cardoso, JF
Affiliations
[1] Inst Natl Telecommun, F-91011 Evry, France
[2] Ecole Natl Super Telecommun, F-75634 Paris, France
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1998, Vol. 9, No. 6
Keywords
adaptive estimation; eigenvectors; Oja's learning equation; principal component analysis; subspace estimation;
DOI
10.1109/72.728373
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we perform a complete asymptotic performance analysis of the stochastic approximation algorithm (denoted the subspace network learning (SNL) algorithm) derived from Oja's learning equation, in the case where the learning rate is constant and a large number of patterns is available. This algorithm drives the connection weight matrix W to an orthonormal basis of a dominant invariant subspace of a covariance matrix. Our approach consists of associating with this algorithm a second stochastic approximation algorithm that governs the evolution of WW^T to the projection matrix onto this dominant invariant subspace. Then, using a general result of Gaussian approximation theory, we derive the asymptotic distribution of the estimated projection matrix. Closed-form expressions of the asymptotic covariance of the projection matrix estimated by the SNL algorithm, and by the smoothed SNL algorithm that we introduce, are given in the case of independent or correlated learning patterns and are further analyzed. It is found that the structures of these asymptotic covariance matrices are similar to those describing batch estimation techniques. The accuracy of our asymptotic analysis is checked by numerical simulations and is found to be valid not only for a "small" learning rate but over a very large domain. Finally, improvements brought by our smoothed SNL algorithm are shown, such as in the learning speed/misadjustment tradeoff and the deviation from orthonormality.
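For readers unfamiliar with the SNL update, the sketch below illustrates the standard form of Oja's subspace rule with a constant learning rate, W <- W + gamma (I - WW^T) x x^T W, and the associated projector estimate WW^T discussed in the abstract. It is a minimal NumPy illustration under assumed settings: the covariance matrix, learning rate, and iteration count are illustrative, not taken from the paper, and the authors' smoothed SNL variant is not reproduced here.

```python
import numpy as np

def snl_update(W, x, gamma):
    """One step of the subspace network learning (SNL) rule derived from
    Oja's learning equation: W <- W + gamma * (I - W W^T) x x^T W.
    W is n x r (connection weights), x is an n-vector (learning pattern),
    gamma is the constant learning rate."""
    y = W.T @ x                                 # r-dimensional output
    return W + gamma * np.outer(x - W @ y, y)   # (x - W y) y^T = (I - W W^T) x x^T W

# Illustrative run (assumed data): estimate the dominant 2-D subspace of a
# diagonal covariance matrix from a stream of independent Gaussian patterns.
rng = np.random.default_rng(0)
n, r, gamma = 6, 2, 1e-2
C = np.diag([5.0, 4.0, 1.0, 0.5, 0.3, 0.1])     # true covariance matrix
A = np.linalg.cholesky(C)
W = 0.1 * rng.standard_normal((n, r))
for _ in range(20000):
    x = A @ rng.standard_normal(n)              # learning pattern with covariance C
    W = snl_update(W, x, gamma)

P_hat = W @ W.T                                 # estimated projection matrix
P_true = np.diag([1.0, 1.0, 0.0, 0.0, 0.0, 0.0])
print("projection error:", np.linalg.norm(P_hat - P_true))
print("deviation from orthonormality:", np.linalg.norm(W.T @ W - np.eye(r)))
```

With a constant gamma the estimate fluctuates around the true projector rather than converging exactly, which is the misadjustment behavior whose asymptotic covariance the paper characterizes.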
Pages: 1246-1257
Number of pages: 12