Feedforward Hebbian learning with nonlinear output units: A Lyapunov approach

Cited: 4
Author
Troyer, TW
Affiliation
[1] W.M. Keck Ctr. Integrative Neurosci., Univ. of California, San Francisco, Box 0444, San Francisco, CA 94143
Keywords
Lyapunov functions; correlational learning; unsupervised learning; competitive learning; saturation; high gain sigmoid; principal component analysis;
DOI
10.1016/0893-6080(95)00044-5
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A Lyapunov function is constructed for the unsupervised learning equations of a large class of neural networks. These networks have a single layer of adjustable connections; units in the output layer are recurrently connected with fixed symmetric weights. The constructed function is similar in form to those derived by Cohen-Grossberg and Hopfield. Two theorems are proved regarding the location of stable equilibria in the limit of high-gain transfer functions. The analysis is applied to the soft competitive learning networks of Amari and Takeuchi.
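For context (this is not reproduced from the record itself): the Cohen-Grossberg/Hopfield energy function to which the abstract compares its construction has, in its classical form for a network with symmetric weights $T_{ij}$, inputs $I_i$, and monotone transfer function $g$, the shape

$$
E(V) \;=\; -\tfrac{1}{2}\sum_{i,j} T_{ij}\, V_i V_j \;-\; \sum_i I_i V_i \;+\; \sum_i \int_0^{V_i} g^{-1}(v)\, dv ,
$$

which is nonincreasing along network trajectories when $T_{ij} = T_{ji}$. The abstract states only that the paper's Lyapunov function for the learning dynamics is "similar in form" to this; the exact function constructed for the feedforward Hebbian weights is given in the paper itself.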
Pages: 321-328
Page count: 8