Convergence Properties of Learning in ART1

Cited: 12
Authors
Georgiopoulos, Michael [1 ]
Heileman, Gregory L. [2 ]
Huang, Juxin [1 ]
Affiliations
[1] Univ Cent Florida, Dept Elect Engn, Orlando, FL 32816 USA
[2] Univ New Mexico, Dept Elect & Comp Engn, Albuquerque, NM 87131 USA
DOI
10.1162/neco.1990.2.4.502
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We consider the ART1 neural network architecture. We show that in the fast-learning case, an ART1 network repeatedly presented with an arbitrary list of binary input patterns self-stabilizes the recognition code of every pattern of size l in at most l presentations of the list.
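The stabilization result can be illustrated with a minimal fast-learning sketch. This is a simplified illustration, not the paper's formal model: it omits the F1/F2 activation dynamics and approximates the search cycle by ranking committed categories with the standard ART1 choice function, keeping only the vigilance test and the fast-learning rule (template ← template AND input). The function name, parameters, and the epoch bound derived from pattern size are assumptions for this sketch.

```python
import numpy as np

def art1_fast_learn(patterns, rho=0.5, beta=0.001):
    """Simplified ART1 fast-learning sketch (illustrative only).

    patterns: list of binary numpy vectors; rho: vigilance; beta: choice parameter.
    Returns the top-down templates after repeated list presentations.
    """
    templates = []  # top-down weight vectors of committed categories
    # The paper's bound: a size-l pattern stabilizes within l list presentations,
    # so iterating max pattern size times suffices (assumption for this sketch).
    max_epochs = max(int(p.sum()) for p in patterns)
    for _ in range(max_epochs):
        changed = False
        for I in patterns:
            # Rank committed categories by the ART1 choice function |I AND w| / (beta + |w|).
            order = sorted(range(len(templates)),
                           key=lambda j: -np.minimum(I, templates[j]).sum()
                                          / (beta + templates[j].sum()))
            for j in order:
                match = np.minimum(I, templates[j]).sum() / I.sum()
                if match >= rho:                        # vigilance test passes
                    new = np.minimum(I, templates[j])   # fast learning: w <- I AND w
                    if not np.array_equal(new, templates[j]):
                        templates[j] = new
                        changed = True
                    break
            else:
                templates.append(I.copy())              # commit an uncommitted node
                changed = True
        if not changed:                                 # recognition codes have stabilized
            break
    return templates
```

With high vigilance, two distinct patterns commit two separate templates and the codes stop changing after the first list presentation, consistent with the stabilization behavior the abstract describes.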
Pages: 502-509
Number of pages: 8
Related Papers
2 in total
[1] Carpenter, G.A., Grossberg, S. A massively parallel architecture for a self-organizing neural pattern-recognition machine. Computer Vision, Graphics, and Image Processing, 1987, 37(1): 54-115.
[2] Grossberg, S. Biological Cybernetics, 1976, 23: 187.