k-Bit Mutation with Self-Adjusting k Outperforms Standard Bit Mutation

Cited by: 32
Authors
Doerr, Benjamin [1 ]
Doerr, Carola [2 ,3 ]
Yang, Jing [1 ]
Affiliations
[1] Ecole Polytech, Palaiseau, France
[2] UPMC Univ Paris 06, LIP6, CNRS, Paris, France
[3] UPMC Univ Paris 06, Sorbonne Univ, CNRS, Paris, France
Source
PARALLEL PROBLEM SOLVING FROM NATURE - PPSN XIV | 2016, Vol. 9921
Keywords
EVOLUTIONARY ALGORITHMS; SEARCH
DOI
10.1007/978-3-319-45823-6_77
CLC number
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
When using the classic standard bit mutation operator, parent and offspring differ in a random number of bits, distributed according to a binomial law. This has the advantage that all Hamming distances occur with some positive probability, so this operator can in principle be used for all fitness landscapes. The downside of this "one-size-fits-all" approach, naturally, is a performance loss caused by the fact that the number of bits flipped is often not the ideal one. Still, the fear of getting stuck in local optima has made standard bit mutation the preferred mutation operator. In this work we show that a self-adjusting choice of the number of bits to be flipped can both avoid the performance loss of standard bit mutation and avoid the risk of getting stuck in local optima. We propose a simple mechanism to adaptively learn the currently optimal mutation strength from previous iterations. This aims at exploiting both that different problems may need different mutation strengths and that, for a fixed problem, different strengths may become optimal in different stages of the optimization process. We experimentally show that our simple hill climber with this adaptive mutation strength outperforms both the randomized local search heuristic and the (1+1) evolutionary algorithm on the LeadingOnes function and on the minimum spanning tree problem. We show via mathematical means that our algorithm is able to detect precisely (apart from lower-order terms) the complicated optimal fitness-dependent mutation strength recently discovered for the OneMax function. With its self-adjusting mutation strength it thus attains the same runtime (apart from o(n) lower-order terms) and the same (asymptotic) 13% fitness-distance improvement over RLS that was recently obtained by manually computing the optimal fitness-dependent mutation strength.
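To make the idea of a hill climber with a learned mutation strength concrete, here is a minimal illustrative sketch, not the authors' exact mechanism: it keeps an exponentially smoothed estimate of the fitness gain each strength k has yielded, exploits the best estimate, and explores a uniformly random k with a small probability. The function names, the parameters `k_max`, `eps`, and `alpha`, and the smoothing rule are all assumptions made for this sketch.

```python
import random

def leading_ones(x):
    """Number of leading 1-bits of a bit list (the LeadingOnes fitness)."""
    count = 0
    for bit in x:
        if bit == 0:
            break
        count += 1
    return count

def flip_k_bits(x, k):
    """Return a copy of x with exactly k distinct positions flipped."""
    y = list(x)
    for i in random.sample(range(len(x)), k):
        y[i] ^= 1
    return y

def self_adjusting_hill_climber(n, fitness, k_max=5, eps=0.2,
                                alpha=0.8, max_iters=200000):
    """(1+1)-style hill climber that learns the mutation strength online.

    Hypothetical mechanism (illustrative only): gain[i] is a smoothed
    estimate of the fitness gain of flipping i+1 bits; with probability
    eps a random strength is tried (exploration), otherwise the strength
    with the best estimated gain is used (exploitation).
    """
    x = [random.randint(0, 1) for _ in range(n)]
    fx = fitness(x)
    gain = [1.0] * k_max  # optimistic initial gain estimate per strength
    iters = 0
    while fx < n and iters < max_iters:
        iters += 1
        if random.random() < eps:
            i = random.randrange(k_max)          # explore a random strength
        else:
            i = max(range(k_max), key=lambda j: gain[j])  # exploit best
        y = flip_k_bits(x, i + 1)
        fy = fitness(y)
        # update the gain estimate for the strength just used
        gain[i] = alpha * gain[i] + (1 - alpha) * max(fy - fx, 0)
        if fy >= fx:  # elitist acceptance, ties accepted
            x, fx = y, fy
    return x, fx, iters
```

On LeadingOnes the estimates quickly steer the search toward small strengths, mimicking how a self-adjusting scheme can recover the behavior of a well-tuned fixed strength without knowing it in advance.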
Pages: 824-834
Page count: 11