The optimal structure of recurrent neural networks for forecasting

Cited by: 0
Authors
Pattamavorakun, S [1 ]
Phien, HN [1 ]
Affiliation
[1] Asian Inst Technol, Klongluang 12120, Pathumthani, Thailand
Source
Proceedings of the IASTED International Conference on Applied Simulation and Modelling | 2004
Keywords
recurrent neural networks; fully recurrent neural network; Elman partially recurrent network; optimal network structure;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A recurrent neural network is a network with feedback (closed-loop) connections. The structure of the network strongly affects the performance of the network model. Since the numbers of nodes in the input and output layers are application-dependent, the optimal-structure problem reduces to choosing the number of nodes in the hidden layer. By combining the Baum-Haussler rule and the Bayesian Information Criterion, we derive a new rule for obtaining the desired structure. In an empirical study using rainfall and discharge data to forecast daily discharges at two important stations, we found that the proposed rule works satisfactorily.
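The abstract does not spell out the combined rule's exact form, so the Python sketch below only illustrates the general idea under stated assumptions: the Baum-Haussler rule of thumb (roughly, hidden nodes at most N_train x tolerance / (N_inputs + N_outputs)) caps the candidate hidden-layer sizes, and the Bayesian Information Criterion then scores each candidate so the best-scoring size is kept. The error tolerance tol, the Elman-style parameter count, and the train_and_score callback are illustrative assumptions, not details taken from the paper.

import math

def baum_haussler_upper_bound(n_train, n_in, n_out, tol=0.1):
    # Rule-of-thumb upper bound on hidden nodes: N_train * tol / (N_in + N_out).
    return max(1, int(n_train * tol / (n_in + n_out)))

def bic(n_train, rss, n_params):
    # Bayesian Information Criterion for a Gaussian-error regression model;
    # rss is the (positive) residual sum of squares on the training set.
    return n_train * math.log(rss / n_train) + n_params * math.log(n_train)

def select_hidden_nodes(n_train, n_in, n_out, train_and_score, tol=0.1):
    # train_and_score(h) is assumed to train a network with h hidden nodes
    # and return its residual sum of squares on the training data.
    best_h, best_bic = 1, float("inf")
    for h in range(1, baum_haussler_upper_bound(n_train, n_in, n_out, tol) + 1):
        # Parameter count of a single-hidden-layer Elman-type network:
        # input-to-hidden, recurrent hidden-to-hidden, and hidden-to-output
        # weights, plus hidden and output biases.
        n_params = h * (n_in + h + n_out) + h + n_out
        score = bic(n_train, train_and_score(h), n_params)
        if score < best_bic:
            best_h, best_bic = h, score
    return best_h

For a daily discharge model with only a handful of inputs, one output, and a few thousand training days, the Baum-Haussler bound keeps the search over hidden-layer sizes small, while the BIC term penalizes the larger candidates.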
Pages: 77-82
Number of pages: 6
References
11 in total
[1] Baum, E.B.; Haussler, D. What size net gives valid generalization? Neural Computation, 1989, 1(1): 151-160.
[2] Chairatanatrai, A. Thesis, Inst. Technology, 2000.
[3] Elman, J.L. Finding structure in time. Cognitive Science, 1990, 14(2): 179-211.
[4] Lin, J.X. Immunity, 1995, 2: 1.
[5] Nash, J.E. Journal of Hydrology, 1970, 10: 282. DOI: 10.1016/0022-1694(70)90255-6.
[6] Sakchaicharoeku, R. Thesis, Asian Institute of Technology, 2002.
[7] Schwarz, G. Estimating the dimension of a model. Annals of Statistics, 1978, 6(2): 461-464.
[8] Sureeratanan, S. Thesis, Asian Institute of Technology, 2000.
[9] Suwarin, P., 2004, SCI US.
[10] Williams, R.J.; Zipser, D. A learning algorithm for continually running fully recurrent neural networks. Neural Computation, 1989, 1(2): 270-280.