Memetic algorithms for training feedforward neural networks: an approach based on gravitational search algorithm

Cited by: 21
Authors
Garcia-Rodenas, Ricardo [1 ]
Jimenez Linares, Luis [2 ]
Alberto Lopez-Gomez, Julio [2 ]
Affiliations
[1] Univ Castilla La Mancha, Dept Math, Ciudad Real, Spain
[2] Univ Castilla La Mancha, Dept Informat Technol & Syst, Ciudad Real, Spain
Keywords
Feedforward neural networks; Memetic algorithms; Gravitational search algorithm; Quasi-Newton methods; PARTICLE SWARM OPTIMIZATION; COMPUTATIONAL INTELLIGENCE; EVOLUTIONARY ALGORITHMS; TRAVEL MODE; CLASSIFIERS; DESIGN; CLASSIFICATION; PARAMETERS; GSA; PERFORMANCE;
DOI
10.1007/s00521-020-05131-y
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The backpropagation (BP) algorithm is a gradient-based algorithm used for training a feedforward neural network (FNN). Although BP is still used today to train FNNs, it has some disadvantages, including the following: (i) it fails for non-differentiable functions, (ii) it can become trapped in local minima, and (iii) it converges slowly. In order to solve some of these problems, metaheuristic algorithms have been used to train FNNs. Although they have good exploration skills, they are not as good as gradient-based algorithms at exploitation tasks. The main contribution of this article lies in its application of novel memetic approaches based on the Gravitational Search Algorithm (GSA) and the Chaotic Gravitational Search Algorithm (CGSA), called the Memetic Gravitational Search Algorithm (MGSA) and the Memetic Chaotic Gravitational Search Algorithm (MCGSA), respectively, to train FNNs on three classical benchmark problems: the XOR problem, the approximation of a continuous function, and classification tasks. The results show that both approaches constitute suitable alternatives for training FNNs, even improving on the performance of other state-of-the-art metaheuristic algorithms such as Particle Swarm Optimization (PSO), the Genetic Algorithm (GA), the Adaptive Differential Evolution algorithm with Repaired crossover rate (Rcr-JADE), and the Covariance matrix learning and Bimodal distribution parameter setting Differential Evolution (COBIDE) algorithm.
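The memetic scheme the abstract describes combines a population-based global search (GSA, where candidate weight vectors attract each other like masses) with a gradient-based local refinement for exploitation. A minimal sketch of that idea on the paper's XOR benchmark is shown below; all parameter values are illustrative assumptions, and the local step uses plain numerical gradient descent as a simple stand-in for the quasi-Newton refinement used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR benchmark (one of the paper's three test problems)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse(w):
    """MSE of a 2-2-1 sigmoid FNN whose 9 weights are packed in w."""
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    return float(np.mean((out - y) ** 2))

def num_grad(f, w, h=1e-5):
    """Central-difference gradient (stand-in for analytic backprop)."""
    g = np.zeros_like(w)
    for i in range(w.size):
        e = np.zeros_like(w)
        e[i] = h
        g[i] = (f(w + e) - f(w - e)) / (2 * h)
    return g

def memetic_gsa(n_agents=20, dim=9, iters=100, G0=10.0, alpha=20.0,
                ls_steps=100, ls_lr=1.0):
    """Illustrative memetic GSA: GSA exploration + greedy gradient refinement."""
    pos = rng.uniform(-1, 1, (n_agents, dim))
    vel = np.zeros_like(pos)
    best_w, best_f = pos[0].copy(), np.inf          # elite record
    for t in range(iters):
        fit = np.array([mse(p) for p in pos])
        i_best = int(np.argmin(fit))
        if fit[i_best] < best_f:
            best_f, best_w = fit[i_best], pos[i_best].copy()
        # Masses: lower error -> heavier agent (minimisation)
        m = (fit.max() - fit) / (fit.max() - fit.min() + 1e-12)
        M = m / (m.sum() + 1e-12)
        G = G0 * np.exp(-alpha * t / iters)         # decaying gravity
        k = max(1, int(n_agents * (1 - t / iters))) # shrinking Kbest set
        heavy = np.argsort(fit)[:k]
        acc = np.zeros_like(pos)
        for i in range(n_agents):
            for j in heavy:
                if j == i:
                    continue
                diff = pos[j] - pos[i]
                acc[i] += rng.random() * G * M[j] * diff / (np.linalg.norm(diff) + 1e-12)
        vel = rng.random((n_agents, dim)) * vel + acc
        pos = pos + vel
    # Memetic refinement of the elite: greedy gradient descent
    # (stand-in for the paper's quasi-Newton local search)
    w, lr = best_w.copy(), ls_lr
    for _ in range(ls_steps):
        cand = w - lr * num_grad(mse, w)
        if mse(cand) < mse(w):
            w = cand
        else:
            lr *= 0.5                               # backtrack on failed step
    return w, mse(w)
```

The greedy accept rule in the local phase keeps the refinement monotone, so the memetic step can only improve on the best solution found by the global GSA phase.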
Pages: 2561-2588
Page count: 28