The gradient evolution algorithm: A new metaheuristic

Cited by: 63
Authors
Kuo, R. J. [1 ]
Zulvia, Ferani E. [1 ]
Affiliations
[1] Natl Taiwan Univ Sci & Technol, Dept Ind Management, Taipei 106, Taiwan
Keywords
Gradient-based method; Metaheuristic method; Optimization; Global optimization; Differential evolution; Search
DOI
10.1016/j.ins.2015.04.031
Chinese Library Classification (CLC)
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
This study presents a new metaheuristic method derived from the gradient-based search method. In exact optimization, the gradient is used to find extreme points, including the optimal point. This study modifies a gradient method to create a metaheuristic that uses the gradient theorem as its basic updating rule. The new method, named gradient evolution, explores the search space with a set of vectors and comprises three major operators: vector updating, vector jumping, and vector refreshing. Vector updating is the main updating rule in gradient evolution; the search direction is determined using the Newton-Raphson method. Vector jumping and refreshing enable the method to escape local optima. To evaluate the performance of gradient evolution, three experiments are conducted on fifteen test functions. The first experiment determines the influence of parameter settings on the result and identifies the best parameter setting. The remaining experiments compare gradient evolution with basic and improved metaheuristic methods. The experimental results show that gradient evolution performs better than, or as well as, other methods, such as particle swarm optimization, differential evolution, the artificial bee colony, and the continuous genetic algorithm, on most of the benchmark problems tested. (C) 2015 Elsevier Inc. All rights reserved.
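The three operators described in the abstract can be sketched in code. This is an illustrative reconstruction, not the paper's exact update formulas: a per-dimension finite-difference step over neighboring vectors stands in for the Newton-Raphson-based vector updating, while a random kick and a random restart stand in for vector jumping and vector refreshing. All names and parameter values (`ge_sketch`, `jump_rate`, the damping factor, and so on) are assumptions chosen for the sketch.

```python
import random

def sphere(x):
    """Benchmark objective: sum of squares, minimum 0 at the origin."""
    return sum(v * v for v in x)

def ge_sketch(f, dim=5, pop=20, iters=200, bounds=(-5.0, 5.0),
              jump_rate=0.1, refresh_rate=0.05, seed=0):
    """Gradient-evolution-style search (illustrative sketch only).

    Each vector takes a Newton-Raphson-like step built from finite
    differences of its better and worse neighbors in the sorted
    population; occasional random jumps and restarts of stale vectors
    mimic the paper's escape mechanisms.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    best = min(X, key=f)
    for _ in range(iters):
        X.sort(key=f)                                   # rank vectors by fitness
        for i in range(1, pop):
            better, here = X[i - 1], X[i]
            worse = X[min(i + 1, pop - 1)]
            cand = list(here)
            for d in range(dim):
                h = worse[d] - better[d]                # spacing between neighbors
                if abs(h) < 1e-12:
                    continue
                grad = (f(worse) - f(better)) / h       # finite-difference slope
                # damped Newton-like move opposite the estimated slope
                step = -0.5 * h * (1.0 if grad > 0 else -1.0)
                cand[d] = min(hi, max(lo, here[d] + step))
            if rng.random() < jump_rate:                # vector jumping: random kick
                d = rng.randrange(dim)
                cand[d] = min(hi, max(lo, cand[d] + rng.uniform(-1.0, 1.0)))
            if f(cand) < f(here):                       # greedy acceptance
                X[i] = cand
            elif rng.random() < refresh_rate:           # vector refreshing: restart
                X[i] = [rng.uniform(lo, hi) for _ in range(dim)]
        cur = min(X, key=f)
        if f(cur) < f(best):
            best = list(cur)
    return best, f(best)

best, val = ge_sketch(sphere)
```

Because candidate moves are accepted greedily and the best vector is tracked separately from refreshes, the returned objective value never exceeds the best value in the initial random population.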
Pages: 246-265
Page count: 20