Empirical Investigation of the Benefits of Partial Lamarckianism

Cited by: 75
Authors
Houck, Christopher R. [1 ]
Joines, Jeffery A. [1 ]
Kay, Michael G. [1 ]
Wilson, James R. [1 ]
Affiliations
[1] N Carolina State Univ, Dept Ind Engn, Raleigh, NC 27695 USA
Funding
U.S. National Science Foundation (NSF);
Keywords
Lamarckian evolution; Baldwin effect; local improvement procedures; hybrid genetic algorithms;
DOI
10.1162/evco.1997.5.1.31
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Genetic algorithms (GAs) are very efficient at exploring the entire search space; however, they are relatively poor at finding the precise local optimal solution in the region in which the algorithm converges. Hybrid GAs combine GAs with local improvement procedures, which are good at finding local optima. There are two basic strategies for using hybrid GAs. In the first, Lamarckian learning, the genetic representation is updated to match the solution found by the improvement procedure. In the second, Baldwinian learning, improvement procedures are used to change the fitness landscape, but the solution that is found is not encoded back into the genetic string. This paper examines the issue of using partial Lamarckianism (i.e., updating the genetic representation for only a percentage of the individuals), as compared to pure Lamarckian and pure Baldwinian learning in hybrid GAs. Multiple instances of five bounded nonlinear problems, the location-allocation problem, and the cell formation problem were used as test problems in an empirical investigation. Neither a pure Lamarckian nor a pure Baldwinian search strategy was found to consistently lead to quicker convergence of the GA to the best known solution for the series of test problems. Based on a minimax criterion (i.e., minimizing the worst-case performance across all test problem instances), the 20% and 40% partial Lamarckianism search strategies yielded the best mixture of solution quality and computational efficiency.
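The strategies contrasted in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the hill-climbing `local_improve`, the `evaluate` helper, and the parameter `p_lamarck` are all assumed names for illustration only. Every individual is scored on its locally improved solution (the Baldwinian change to the fitness landscape), but only a fraction `p_lamarck` of individuals have that improved solution written back into the genotype (the Lamarckian step); `p_lamarck=1.0` gives pure Lamarckian learning, `0.0` pure Baldwinian, and `0.2` the 20% partial strategy the paper favors.

```python
import random

def local_improve(x, f, step=0.1, iters=20):
    """Toy hill-climbing local improvement procedure (a stand-in for
    the gradient-based optimizers used for the paper's test problems)."""
    best = x
    for _ in range(iters):
        cand = [xi + random.gauss(0, step) for xi in best]
        if f(cand) < f(best):  # minimization: keep strictly better moves
            best = cand
    return best

def evaluate(pop, f, p_lamarck=0.2):
    """Evaluate a population under partial Lamarckianism.

    Fitness always comes from the improved solution (Baldwinian effect),
    but only a fraction p_lamarck of individuals have the improved
    solution encoded back into their genetic representation."""
    fits = []
    for i, ind in enumerate(pop):
        improved = local_improve(ind, f)
        fits.append(f(improved))
        if random.random() < p_lamarck:
            pop[i] = improved  # Lamarckian write-back of the genotype
    return fits

# Toy usage: score a random population on the sphere function
# with a 20% partial Lamarckian strategy.
sphere = lambda x: sum(xi * xi for xi in x)
pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(10)]
fits = evaluate(pop, sphere, p_lamarck=0.2)
print(min(fits))
```

In a full hybrid GA, `evaluate` would replace the plain fitness call inside the selection/crossover/mutation loop; everything else about the GA is unchanged.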
Pages: 31-60
Page count: 30