A learning algorithm for improved recurrent neural networks

Cited by: 0
Authors
Chen, CH
Yu, LW
Source
1997 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, VOLS 1-4 | 1997
DOI
Not available
CLC classification number
TP18 [Artificial intelligence theory];
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
An improved recurrent neural network structure is proposed. The exact form of a gradient-following learning algorithm for continuously running neural networks is derived for temporal supervised learning tasks. The algorithm allows networks to learn complex tasks that require the retention of information over time, and it compensates for information that is missed by traditional recurrent neural networks. Empirical results show that networks trained with this algorithm achieve better prediction performance than the backpropagation-trained network and the Elman recurrent neural network.
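The paper's network structure and update equations are not reproduced in this record. As an illustration of the class of method the abstract describes, the sketch below implements a generic gradient-following (RTRL-style) update for a continuously running recurrent network; the network size, tanh activation, learning rate, and the toy one-step-memory task are assumptions made for the example, not details taken from the paper.

```python
# Minimal sketch of a gradient-following (real-time recurrent learning style)
# update for a continuously running recurrent network. Sizes, activation,
# learning rate, and the toy task are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_units = 2, 4            # assumed sizes (not from the paper)
n_z = n_in + n_units + 1        # inputs + recurrent outputs + bias
W = rng.normal(0.0, 0.1, size=(n_units, n_z))
lr = 0.05                       # assumed learning rate

y = np.zeros(n_units)                   # unit outputs y_k(t)
p = np.zeros((n_units, n_units, n_z))   # sensitivities dy_k / dw_ij

def step(x, target_idx, target_val):
    """One time step: forward pass plus real-time gradient update."""
    global y, p, W
    z = np.concatenate([x, y, [1.0]])   # combined input vector z(t)
    s = W @ z                           # net inputs s_k(t)
    y_new = np.tanh(s)                  # unit outputs y_k(t+1)
    fprime = 1.0 - y_new**2             # tanh'(s_k)

    # Sensitivity recursion:
    # p_k,ij(t+1) = f'(s_k) * ( sum_l W[k, n_in+l] * p_l,ij(t) + delta_ki * z_j(t) )
    W_rec = W[:, n_in:n_in + n_units]   # recurrent portion of W
    p_new = np.einsum('kl,lij->kij', W_rec, p)
    p_new[np.arange(n_units), np.arange(n_units), :] += z
    p_new *= fprime[:, None, None]

    # Error on the designated output unit; move weights along the gradient.
    e = np.zeros(n_units)
    e[target_idx] = target_val - y_new[target_idx]
    W += lr * np.einsum('k,kij->ij', e, p_new)

    y, p = y_new, p_new
    return y_new[target_idx]

# Toy usage: train unit 0 to reproduce the previous input bit (one-step memory).
x_prev = 0.0
for t in range(2000):
    x_t = float(rng.integers(0, 2))
    step(np.array([x_t, 1.0 - x_t]), target_idx=0, target_val=x_prev)
    x_prev = x_t
```

Because the sensitivity tensor carries gradient information forward in time, the weights can be updated at every step without unrolling the network, which is the property that lets a gradient-following learner of this kind retain information across time steps.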
Pages: 2198-2202
Number of pages: 5