Multi-objective Parameter Tuning with Dynamic Compositional Surrogate Models

Cited by: 1
Authors
Pukhkaiev, Dmytro [1 ]
Husak, Oleksandr [1 ]
Gotz, Sebastian [1 ]
Assmann, Uwe [1 ]
Affiliations
[1] Tech Univ Dresden, Software Technol Grp, Dresden, Germany
Source
LEARNING AND INTELLIGENT OPTIMIZATION, LION 15 | 2021 / Vol. 12931
Keywords
Parameter tuning; Hyperparameter optimization; Multi-objective optimization; Surrogate models; EVOLUTIONARY ALGORITHMS; OPTIMIZATION;
DOI
10.1007/978-3-030-92121-7_26
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Multi-objective parameter tuning is a highly practical black-box optimization problem in which the target system is expensive to evaluate. To identify well-performing solutions within a limited budget, substituting the target system with a surrogate model, a cheap-to-evaluate approximation, brings immense benefits. Some surrogates are more successful for particular objective functions, others at certain stages of the optimization. Alas, most state-of-the-art approaches do not address this issue: they either require the surrogate model to be selected at design time, or lack granularity, changing all models for all objective functions simultaneously. In this paper we present an approach that assigns surrogate models to objective functions individually and dynamically combines them into multi-objective compositional surrogate models. To ensure high prediction quality, our approach includes a model validation strategy based on the cross-validation principle. Moreover, we unite multiple compositional surrogates within a portfolio to further increase the quality of the search process. Finally, we use the proposed validation strategy to enable a dynamic sampling plan, which yields high-quality solutions with even fewer evaluations. An evaluation with the WFG benchmark suite for multi-objective optimization shows that our approach outperforms existing multi-objective model-based approaches.
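The compositional idea described in the abstract can be illustrated with a minimal sketch. It is not the authors' implementation: the class name CompositionalSurrogate, the candidate model types, and the cross-validation validity threshold min_score are all assumptions chosen for illustration; only the general scheme (one individually selected and validated surrogate per objective function) follows the abstract.

```python
# Minimal sketch (not the paper's code): one surrogate per objective,
# selected and validated via cross-validation before it is trusted.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score


class CompositionalSurrogate:
    """Assigns an individually chosen regressor to each objective function."""

    def __init__(self, candidate_factories, min_score=0.65):
        # candidate_factories: callables that create fresh regressor instances
        self.candidate_factories = candidate_factories
        self.min_score = min_score      # validity threshold (assumed value)
        self.models_ = []               # selected model per objective

    def fit(self, X, Y):
        self.models_ = []
        for j in range(Y.shape[1]):     # iterate over objective functions
            best_factory, best_score = None, -np.inf
            for make_model in self.candidate_factories:
                # mean cross-validated R^2 of this candidate on objective j
                score = cross_val_score(make_model(), X, Y[:, j], cv=3).mean()
                if score > best_score:
                    best_factory, best_score = make_model, score
            if best_score < self.min_score:
                # no candidate passed validation; caller may fall back to sampling
                self.models_.append(None)
            else:
                self.models_.append(best_factory().fit(X, Y[:, j]))
        return self

    def predict(self, X):
        # one prediction vector per objective; None marks an invalid model
        return [m.predict(X) if m is not None else None for m in self.models_]


# Usage: two objectives may end up approximated by different model types,
# e.g. a Gaussian process for one and a random forest for the other.
surrogate = CompositionalSurrogate(
    candidate_factories=[
        GaussianProcessRegressor,
        lambda: RandomForestRegressor(n_estimators=50),
    ]
)
```

The per-objective selection is the point of the sketch: because each objective is validated separately, heterogeneous model types can coexist in one compositional surrogate, and an objective whose models all fail validation can trigger further sampling instead of misleading the search.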
Pages: 333-350
Number of pages: 18