Research progress of hardness theories on evolutionary algorithm

Cited by: 0
Authors
Affiliations
[1] College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing
[2] School of Information Engineering, Nanchang Hangkong University, Nanchang
Source
Li, M. (limingniat@hotmail.com) | Chinese Institute of Electronics, Vol. 42, 2014
Keywords
Epistasis measure; Fitness landscape; Landscape state machine; Optimal contraction theorem; Optimization hardness; Spatial correlation
DOI
10.3969/j.issn.0372-2112.2014.02.026
Abstract
Research on problem hardness for evolutionary algorithms is an important branch of evolutionary computation. It studies the relationship between the performance of an evolutionary algorithm and the characteristics of the optimization problem, and its goal is to estimate, from limited information, how well an evolutionary algorithm will perform when deployed on a given optimization problem. This paper summarizes six kinds of hardness theories for evolutionary algorithms, including the fitness-distance correlation model, fitness landscape methods, the landscape state machine, the optimal contraction theorem, and epistasis methods, together with eight hardness indicators associated with them. It also discusses the advantages and disadvantages of these methods and surveys the trends of this research field.
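
To make one of the surveyed indicators concrete: the fitness-distance correlation (FDC) coefficient is the Pearson correlation between the fitness values of sampled solutions and their distances to a known global optimum; for a maximization problem, values near -1 suggest an easy landscape. The sketch below is a minimal illustration only, not taken from the paper: it assumes a bit-string encoding with Hamming distance, and the function names, sampling scheme, and the OneMax toy problem are illustrative assumptions.

    import numpy as np

    def fitness_distance_correlation(samples, fitnesses, optimum):
        """Estimate the FDC coefficient r = cov(f, d) / (std(f) * std(d)).

        samples   : (n, L) array of bit strings (candidate solutions)
        fitnesses : (n,) array of their fitness values
        optimum   : (L,) bit string of a known global optimum
        """
        # Hamming distance from each sampled solution to the optimum
        d = np.sum(samples != optimum, axis=1)
        f = np.asarray(fitnesses, dtype=float)
        # Pearson correlation between fitness and distance (population statistics)
        return np.cov(f, d, bias=True)[0, 1] / (f.std() * d.std())

    # Toy usage on OneMax (optimum = all ones): fitness rises exactly as the
    # distance to the optimum shrinks, so the coefficient should be -1.
    rng = np.random.default_rng(0)
    L, n = 20, 500
    pop = rng.integers(0, 2, size=(n, L))
    fit = pop.sum(axis=1)                 # OneMax fitness
    opt = np.ones(L, dtype=int)
    print(fitness_distance_correlation(pop, fit, opt))

On OneMax the coefficient comes out at exactly -1, matching the intuition that fitness improves monotonically as the optimum is approached; harder, more deceptive landscapes yield values closer to 0 or positive.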
Pages: 383-390
Number of pages: 7