A Survey of Automatic Parameter Tuning Methods for Metaheuristics

Cited by: 244
Authors
Huang, Changwu [1 ]
Li, Yuanxiang [2 ]
Yao, Xin [1 ,3 ]
Affiliations
[1] Southern Univ Sci & Technol, Dept Comp Sci & Engn, Univ Key Lab Evolving Intelligent Syst Guangdong, Shenzhen Key Lab Computat Intelligence, Shenzhen 518055, Peoples R China
[2] Wuhan Univ, Sch Comp Sci, Wuhan 430072, Peoples R China
[3] Univ Birmingham, Sch Comp Sci, CERCIA, Birmingham B15 2TT, W Midlands, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
Tuning; Task analysis; Heuristic algorithms; Optimization; Computer science; Measurement; Systematics; Automatic parameter tuning; metaheuristics; parameter setting; parameter tuning; CMA EVOLUTION STRATEGY; COMBINATORIAL OPTIMIZATION; GLOBAL OPTIMIZATION; ALGORITHM; SEARCH; CONFIGURATION; SELECTION;
DOI
10.1109/TEVC.2019.2921598
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Parameter tuning, that is, finding appropriate parameter settings (or configurations) for algorithms so that their performance is optimized, is an important task in the development and application of metaheuristics. Automating this task, i.e., developing algorithmic procedures that address parameter tuning, is highly desirable and has attracted significant attention from researchers and practitioners. Over the last two decades, many automatic parameter tuning approaches have been proposed. This paper presents a comprehensive survey of automatic parameter tuning methods for metaheuristics. A new classification (or taxonomy) of automatic parameter tuning methods is introduced according to the structure of the tuning methods. Existing automatic parameter tuning approaches are accordingly classified into three categories: 1) simple generate-evaluate methods; 2) iterative generate-evaluate methods; and 3) high-level generate-evaluate methods. These three categories of tuning methods are then reviewed in turn. In addition to describing each tuning method, its main strengths and weaknesses are discussed, which helps new researchers and practitioners select appropriate tuning methods. Furthermore, some challenges and directions for further research in this field are pointed out.
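To make the taxonomy concrete, the following is an illustrative sketch (not taken from the survey itself) of the first category, a simple generate-evaluate tuner: all candidate configurations are generated up front, each is evaluated by running the target metaheuristic several times, and the best-performing configuration is returned. The target algorithm here is a hypothetical toy hill climber with a single step-size parameter, standing in for any metaheuristic being tuned.

```python
import random


def target_algorithm(step_size, seed):
    """Toy metaheuristic being tuned: stochastic hill climbing on
    f(x) = -(x - 3)^2 (maximization), where `step_size` is the tunable
    parameter. Returns the best objective value found (higher is better)."""
    rng = random.Random(seed)
    x = 0.0
    best = -(x - 3) ** 2
    for _ in range(200):
        candidate = x + rng.gauss(0, step_size)
        if -(candidate - 3) ** 2 > -(x - 3) ** 2:
            x = candidate
            best = max(best, -(x - 3) ** 2)
    return best


def simple_generate_evaluate(candidates, n_runs=5):
    """Simple generate-evaluate tuning: the candidate configurations are
    fixed in advance (here a user-supplied list), each is evaluated over
    `n_runs` independent runs, and the configuration with the best average
    performance is selected. No feedback guides configuration generation."""
    scores = {}
    for step_size in candidates:
        runs = [target_algorithm(step_size, seed) for seed in range(n_runs)]
        scores[step_size] = sum(runs) / n_runs
    best_config = max(scores, key=scores.get)
    return best_config, scores


best_step, scores = simple_generate_evaluate([0.01, 0.1, 0.5, 1.0, 2.0])
```

Iterative generate-evaluate methods differ from this sketch in that the evaluation results of earlier candidates feed back into the generation of later ones, e.g., via racing or model-based search, rather than fixing the candidate set beforehand.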
Pages: 201 - 216
Page count: 16