Initial Design Strategies and their Effects on Sequential Model-Based Optimization: An Exploratory Case Study Based on BBOB

Cited by: 16
Authors
Bossek, Jakob [1 ]
Doerr, Carola [2 ]
Kerschke, Pascal [3 ]
Affiliations
[1] Univ Adelaide, Sch Comp Sci, Adelaide, SA, Australia
[2] Sorbonne Univ, CNRS, LIP6, Paris, France
[3] Univ Münster, Informat Syst & Stat, Münster, Germany
Source
GECCO'20: PROCEEDINGS OF THE 2020 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE | 2020
Keywords
Sequential Model-Based Optimization; Design of Experiments; Initial Design; Continuous Black-Box Optimization
DOI
10.1145/3377930.3390155
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Sequential model-based optimization (SMBO) approaches are algorithms for solving problems that require computationally or otherwise expensive function evaluations. The key design principle of SMBO is the substitution of the true objective function by a surrogate, which is used to propose the point(s) to be evaluated next. SMBO algorithms are intrinsically modular, leaving the user with many important design choices. Significant research efforts go into understanding which settings perform best for which type of problems. Most works, however, focus on the choice of the model, the acquisition function, and the strategy used to optimize the latter. The choice of the initial sampling strategy, in contrast, has received much less attention. Not surprisingly, quite divergent recommendations can be found in the literature. In this work, we analyze how the size and the distribution of the initial sample influence the overall quality of the efficient global optimization (EGO) algorithm, a well-known SMBO approach. While, overall, small initial budgets using Halton sampling seem preferable, we also observe that the performance landscape is rather unstructured. We furthermore identify several situations in which EGO performs unfavorably against random sampling. Both observations indicate that an adaptive SMBO design could be beneficial, making SMBO an interesting test-bed for automated algorithm design.
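The two initial-design strategies contrasted in the abstract, uniform random sampling and the low-discrepancy Halton sequence, can be sketched as follows. This is a minimal illustration using SciPy's quasi-Monte Carlo module, not the authors' implementation (the paper's experiments are based on an R toolchain); the function name and parameters here are illustrative.

```python
import numpy as np
from scipy.stats import qmc


def initial_design(n, dim, method="halton", seed=0):
    """Draw an initial sample of n points in [0, 1]^dim for an SMBO run.

    method="halton" uses a scrambled Halton low-discrepancy sequence,
    which spreads points more evenly over the box than i.i.d. uniform
    sampling; method="random" draws i.i.d. uniform points.
    """
    if method == "halton":
        sampler = qmc.Halton(d=dim, seed=seed)
        return sampler.random(n)
    rng = np.random.default_rng(seed)
    return rng.uniform(size=(n, dim))


# Example: a small Halton initial design in 2D, as the paper's results
# suggest for modest evaluation budgets.
X = initial_design(10, 2, method="halton")
```

The returned array would then be evaluated on the expensive objective and used to fit the first surrogate model of the EGO loop.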
Pages: 778-786 (9 pages)