An improved recurrent neural network structure is proposed. The exact form of a gradient-following learning algorithm for continuously running recurrent networks is derived for temporal supervised learning tasks. The algorithm allows networks to learn complex tasks that require retaining information over extended time periods, and it compensates for temporal information that traditional recurrent neural networks miss. Empirical results show that networks trained with this algorithm achieve better prediction performance than both a network trained with standard backpropagation and an Elman recurrent neural network.
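The abstract does not spell out the update equations, but a gradient-following algorithm for a continuously running recurrent network is commonly realized as a real-time recurrent learning (RTRL)-style sensitivity recurrence. The sketch below illustrates that general idea only; the network sizes, the tanh nonlinearity, the learning rate, and the toy delayed-echo task are all assumptions for illustration, not the paper's specification.

```python
import numpy as np

# Minimal sketch of an RTRL-style gradient-following update for a fully
# recurrent, continuously running network.  Sensitivities
# p[k, i, j] = d y_k / d W[i, j] are carried forward in time so the
# exact gradient of the instantaneous error is available online.
# All names and sizes are illustrative assumptions.

rng = np.random.default_rng(0)

n_in, n_units = 2, 4          # external inputs, recurrent units (assumed)
n_z = n_units + n_in          # each unit sees all unit outputs plus all inputs
W = rng.normal(0.0, 0.5, (n_units, n_z))

y = np.zeros(n_units)                     # current unit activations
p = np.zeros((n_units, n_units, n_z))     # sensitivities d y_k / d W_ij

def step(x, target, lr=0.05):
    """Advance the network one time step and follow the exact gradient of
    the instantaneous error.  NaN entries of `target` carry no teaching
    signal (their error is treated as zero)."""
    global y, p, W
    z = np.concatenate([y, x])            # previous state + current input
    s = W @ z
    y_new = np.tanh(s)
    fprime = 1.0 - y_new ** 2             # tanh'(s)

    # Sensitivity recurrence: chain rule through the recurrent weights,
    # plus the direct dependence of unit i's net input on W[i, j].
    p_new = np.einsum('kl,lij->kij', W[:, :n_units], p)
    for i in range(n_units):
        p_new[i, i, :] += z
    p_new *= fprime[:, None, None]

    e = np.where(np.isnan(target), 0.0, target - y_new)
    W += lr * np.einsum('k,kij->ij', e, p_new)   # online gradient step

    y, p = y_new, p_new
    return 0.5 * float(e @ e)

# Toy usage (assumed task): train unit 0 to reproduce input 0 delayed by
# one step, which requires retaining information across time.
prev = 0.0
for t in range(2000):
    x = rng.choice([-0.8, 0.8], size=n_in)
    target = np.full(n_units, np.nan)
    target[0] = prev
    loss = step(x, target)
    prev = x[0]
```

Because the sensitivity tensor has one slice per weight, this scheme trades memory and per-step cost for the ability to update weights online at every time step, which is what lets such networks retain task-relevant information over long intervals.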