Training feedforward neural networks using hybrid particle swarm optimization and gravitational search algorithm

Cited by: 431
Authors
Mirjalili, SeyedAli [1 ]
Hashim, Siti Zaiton Mohd [1 ]
Sardroudi, Hossein Moradian [1 ]
Affiliations
[1] Univ Teknol Malaysia, Fac Comp Sci & Informat Syst, Soft Comp Res Grp, Utm Skudai 81310, Johor, Malaysia
Keywords
FNN; Neural network; Learning neural network; Gravitational search algorithm; Particle swarm optimization; PSO; Evolutionary algorithm; Multilayer perceptron
DOI
10.1016/j.amc.2012.04.069
Chinese Library Classification
O29 [Applied Mathematics]
Subject classification code
070104
Abstract
The Gravitational Search Algorithm (GSA) is a novel heuristic optimization method based on the law of gravity and mass interactions. It has been shown that this algorithm has a good ability to locate the global optimum, but it suffers from slow search speed in the final iterations. This work proposes a hybrid of Particle Swarm Optimization (PSO) and GSA to resolve this problem. In this paper, GSA and PSOGSA are employed as new training methods for Feedforward Neural Networks (FNNs) in order to investigate how effectively these algorithms reduce the local-minima trapping and slow convergence that afflict current evolutionary learning algorithms. The results are compared with a standard PSO-based learning algorithm for FNNs. The resulting accuracy of FNNs trained with PSO, GSA, and PSOGSA is also investigated. The experimental results show that PSOGSA outperforms both PSO and GSA for training FNNs in terms of convergence speed and avoidance of local minima. It is also shown that an FNN trained with PSOGSA achieves better accuracy than one trained with GSA. (c) 2012 Elsevier Inc. All rights reserved.
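As a rough illustration of the hybridization described in the abstract (not the paper's exact formulation), the sketch below computes GSA-style gravitational accelerations from fitness-derived masses and folds them into a PSO-style velocity update that also pulls each agent toward the global best. All parameter values here (inertia `w`, coefficients `c1`/`c2`, initial constant `G0`, the decay rate, and the sphere test function) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def sphere(x):
    """Simple benchmark objective: sum of squares, minimum 0 at the origin."""
    return float(np.sum(x ** 2))

def psogsa(obj, dim=5, n_agents=20, iters=100, seed=0):
    """Minimal PSO-GSA hybrid sketch (minimization)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5.0, 5.0, (n_agents, dim))   # agent positions
    V = np.zeros((n_agents, dim))                 # agent velocities
    gbest, gbest_fit = None, np.inf
    G0, w, c1, c2 = 1.0, 0.5, 0.5, 1.5            # illustrative parameters

    for t in range(iters):
        fit = np.array([obj(x) for x in X])
        i = int(fit.argmin())
        if fit[i] < gbest_fit:
            gbest_fit, gbest = fit[i], X[i].copy()

        # GSA masses: normalized so the best (lowest-fitness) agent is heaviest.
        best, worst = fit.min(), fit.max()
        m = (fit - worst) / (best - worst - 1e-12)
        M = m / (m.sum() + 1e-12)

        # Gravitational constant decays over iterations, as in standard GSA.
        G = G0 * np.exp(-20.0 * t / iters)

        # Accelerations: each agent is attracted to every other agent
        # in proportion to the other's mass and inversely to distance.
        A = np.zeros_like(X)
        for a in range(n_agents):
            for b in range(n_agents):
                if a == b:
                    continue
                diff = X[b] - X[a]
                dist = np.linalg.norm(diff) + 1e-12
                A[a] += rng.random() * G * M[b] * diff / dist

        # PSO-style update: GSA acceleration plus a pull toward the global best.
        V = (w * V
             + c1 * rng.random((n_agents, dim)) * A
             + c2 * rng.random((n_agents, dim)) * (gbest - X))
        X = X + V

    return gbest, gbest_fit
```

The key design idea the paper's abstract points at: GSA alone lacks memory of the best solution found so far, which slows late-stage convergence; adding the PSO-style `gbest` term gives agents that memory while keeping GSA's gravity-based exploration.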
Pages: 11125-11137
Page count: 13