Constructive learning of recurrent neural networks: Limitations of recurrent cascade correlation and a simple solution - Comment

Cited: 6
Author(s)
Kremer, S.C.
Affiliation
[1] Communication Research Centre, Ottawa
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1996, Vol. 7, No. 4
DOI
10.1109/72.508949
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Giles et al. have proven that Fahlman's recurrent cascade correlation (RCC) architecture cannot realize finite state automata that have state-cycles of length greater than two under a constant input signal. This paper extends their conclusions by showing that a corollary to the original proof identifies a second, large class of automata that is also unrepresentable by RCC.
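To make the limitation concrete, the following sketch (illustrative only, not from the paper; the automaton and function names are hypothetical) simulates a finite state automaton whose states form a cycle of length three under a constant input symbol. Machines of exactly this kind are the ones Giles et al. proved RCC cannot realize.

```python
def make_cycle_automaton(n):
    """Transition function delta(state, symbol): under the constant input
    symbol 'a', the machine steps through states 0, 1, ..., n-1 cyclically.
    (Hypothetical example automaton, not taken from the paper.)"""
    def delta(state, symbol):
        if symbol == "a":
            return (state + 1) % n
        return state  # any other symbol leaves the state unchanged
    return delta

def orbit(delta, start, symbol, steps):
    """States visited when the same input symbol is applied repeatedly."""
    states = [start]
    for _ in range(steps):
        states.append(delta(states[-1], symbol))
    return states

# A state-cycle of length 3 > 2 under the constant input 'a':
delta = make_cycle_automaton(3)
print(orbit(delta, 0, "a", 6))  # [0, 1, 2, 0, 1, 2, 0] -- period 3
```

Because the orbit under a constant input has period three, no RCC network can implement this transition function, whereas a cycle of length one or two would remain representable.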
Pages: 1047-1049
Page count: 3
References
4 in total
[1] Alon, N., Dewdney, A.K., Ott, T.J. Efficient simulation of finite automata by neural nets. Journal of the ACM, 1991, 38(2):495-514.
[2] [Anonymous], 1991, Advances in Neural Information Processing Systems.
[3] Kremer, S.C., 1996, Advances in Neural Information Processing Systems, Vol. 8.
[4] Tomita, M., 1982, Proceedings of the 4th Annual Cognitive Science Conference, p. 105.