A novel evolutionary neural learning algorithm

Cited by: 0
Authors
Verma, B [1 ]
Ghosh, R [1 ]
Affiliation
[1] Griffith Univ, Sch Informat Technol, Gold Coast, Qld 9726, Australia
Source
CEC'02: PROCEEDINGS OF THE 2002 CONGRESS ON EVOLUTIONARY COMPUTATION, VOLS 1 AND 2 | 2002
Keywords
evolutionary algorithm; least square method; neural learning algorithm; evolving neural networks;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we present a novel hybrid learning approach based on a genetic algorithm and least squares (GALS) for training an artificial neural network (ANN). The approach combines an evolutionary algorithm with matrix solution methods such as Gram-Schmidt orthogonalisation and singular value decomposition (SVD) to adjust the weights of the hidden and output layers. Our hybrid method (GALS) applies the evolutionary algorithm (EA) to the first layer and the least squares (LS) method to the second layer of the ANN. In the proposed approach, a two-layer network is considered: the hidden-layer weights are evolved using an evolutionary algorithm, and the output-layer weights are calculated using a linear least squares method. Training stops when a given number of generations or an error goal in terms of RMS error is reached. Training starts with a small number of hidden neurons, and the number is then increased gradually in an incremental process. The proposed algorithm was implemented and many experiments were conducted on benchmark data sets such as XOR, 10-bit odd parity, segmented handwritten character recognition, breast cancer diagnosis and heart disease data. The experimental results are very promising compared with other existing evolutionary and error back-propagation (EBP) algorithms in terms of classification rate and time complexity.
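The abstract describes a hybrid in which only the hidden-layer weights are evolved while the output-layer weights are obtained analytically. The sketch below illustrates that idea in Python; the operator choices (truncation selection, Gaussian mutation), hyperparameters, and all function names are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal GALS-style sketch: evolve hidden weights with a simple GA,
# solve output weights by linear least squares, stop on generations or
# an RMS error goal. Assumed details are noted in comments.
import numpy as np

rng = np.random.default_rng(0)

def hidden_activations(X, W):
    # W has shape (n_inputs + 1, n_hidden); a bias column is appended to X.
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return 1.0 / (1.0 + np.exp(-Xb @ W))            # sigmoid hidden layer

def fitness(X, y, W):
    # Output weights come from a least-squares solve, not from evolution.
    H = hidden_activations(X, W)
    W_out, *_ = np.linalg.lstsq(H, y, rcond=None)
    rms = np.sqrt(np.mean((H @ W_out - y) ** 2))
    return rms, W_out

def train_gals(X, y, n_hidden=4, pop_size=30, generations=200,
               error_goal=0.05, mutation_std=0.3):
    n_in = X.shape[1] + 1                            # +1 for the bias weight
    pop = rng.normal(0.0, 1.0, size=(pop_size, n_in, n_hidden))
    best_W, best_out, best_rms = None, None, np.inf
    for _ in range(generations):
        scored = []
        for W in pop:
            rms, W_out = fitness(X, y, W)
            scored.append((rms, W, W_out))
            if rms < best_rms:
                best_rms, best_W, best_out = rms, W, W_out
        if best_rms <= error_goal:                   # stop on RMS error goal
            break
        scored.sort(key=lambda t: t[0])
        parents = [W for _, W, _ in scored[:pop_size // 2]]
        children = [W + rng.normal(0.0, mutation_std, W.shape)
                    for W in parents]                # Gaussian mutation
        pop = np.array(parents + children)
    return best_W, best_out, best_rms

if __name__ == "__main__":
    # XOR, one of the benchmark problems mentioned in the abstract.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0.0], [1.0], [1.0], [0.0]])
    W_h, W_o, rms = train_gals(X, y)
    print("final RMS error:", rms)
    print("predictions:", hidden_activations(X, W_h) @ W_o)
```

The incremental growth of hidden neurons mentioned in the abstract could be layered on top of this by rerunning train_gals with a larger n_hidden whenever the error goal is not met.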
Pages: 1884-1889
Page count: 6