Constrained multiobjective optimization of expensive black-box functions using a heuristic branch-and-bound approach

Cited by: 0
Authors
Jones, Donald R. [1 ]
Lovison, Alberto [2 ]
Affiliations
[1] Univ Michigan, Dept Aerosp Engn, Ann Arbor, MI 48109 USA
[2] Univ Salento, Dept Math & Phys, Lecce, Italy
Keywords
Nonlinear constraints; Multiobjective optimization; Lipschitzian optimization; Black-box; Derivative-free; Deterministic; DIRECT
DOI
10.1007/s10898-023-01336-2
Chinese Library Classification (CLC)
C93 [Management Science]; O22 [Operations Research]
Discipline classification codes
070105; 12; 1201; 1202; 120202
Abstract
While constrained multiobjective optimization is generally very difficult, there is a special case in which such problems can be solved with a simple, elegant branch-and-bound algorithm. This special case is when the objective and constraint functions are Lipschitz continuous with known Lipschitz constants. Given these Lipschitz constants, one can compute lower bounds on the functions over subregions of the search space. This allows one to iteratively partition the search space into rectangles, deleting those rectangles whose points, based on the lower bounds, are all provably infeasible or all provably dominated by previously sampled points. As the algorithm proceeds, the rectangles that have not been deleted provide a tight covering of the Pareto set in the input space. Unfortunately, this elegant algorithm cannot be applied to black-box optimization, because the Lipschitz constants are unknown. In this paper, we show how this branch-and-bound algorithm can be heuristically extended to black-box problem functions using an approach similar to that of the well-known DIRECT global optimization algorithm. We call the resulting method "simDIRECT." Initial experience with simDIRECT on test problems suggests that it performs similarly to, or better than, multiobjective evolutionary algorithms when solving problems with small numbers of variables (up to 12) and a limited number of runs (up to 600).
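The known-Lipschitz special case described in the abstract can be sketched in a few lines. The following is an illustrative reconstruction, not the paper's simDIRECT (which avoids needing the Lipschitz constants): the function name `lipschitz_bnb`, the rectangle-selection rule, and the strict-inequality domination test are our own simplifying assumptions. A rectangle with center c and half-diagonal d admits the lower bound f(c) - L*d on a function with Lipschitz constant L; rectangles whose bounds prove infeasibility or domination are deleted, and survivors are trisected along their longest side, as in DIRECT.

```python
import numpy as np

def lipschitz_bnb(f_list, g_list, Lf, Lg, lo, hi, iters=200):
    """Branch-and-bound cover of the Pareto set for
    min [f_1(x), ..., f_m(x)]  s.t.  g_j(x) <= 0 on the box [lo, hi],
    assuming each f_i, g_j is Lipschitz with known constants Lf, Lg.
    Returns (surviving rectangles, objective vectors of nondominated
    feasible sample points). Illustrative sketch only."""
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    rects = [(lo, hi)]
    archive = []  # objective vectors of feasible sampled points

    def provably_dominated(lb):
        # An archived point strictly below every component of the
        # objective lower bounds dominates *all* points in the rectangle.
        return any(all(a < v for a, v in zip(arc, lb)) for arc in archive)

    for _ in range(iters):
        if not rects:
            break
        # Process the rectangle with the largest half-diagonal.
        i = max(range(len(rects)),
                key=lambda k: np.linalg.norm(rects[k][1] - rects[k][0]))
        a, b = rects.pop(i)
        c = (a + b) / 2.0
        d = np.linalg.norm((b - a) / 2.0)      # center-to-corner distance
        gval = [g(c) for g in g_list]
        fval = [f(c) for f in f_list]
        # Lipschitz lower bounds valid over the whole rectangle.
        g_lb = [gv - L * d for gv, L in zip(gval, Lg)]
        f_lb = [fv - L * d for fv, L in zip(fval, Lf)]
        if any(glb > 0 for glb in g_lb):
            continue                            # every point provably infeasible
        if provably_dominated(f_lb):
            continue                            # every point provably dominated
        if all(gv <= 0 for gv in gval) and not provably_dominated(fval):
            archive.append(fval)                # (archive pruning omitted)
        # Survivor: trisect along its longest side, as DIRECT does.
        j = int(np.argmax(b - a))
        w = (b[j] - a[j]) / 3.0
        for k in range(3):
            aa, bb = a.copy(), b.copy()
            aa[j] = a[j] + k * w
            bb[j] = a[j] + (k + 1) * w
            rects.append((aa, bb))
    return rects, archive
```

As the iterations proceed, the undeleted rectangles shrink toward a covering of the Pareto set; replacing the known-constant bounds with DIRECT-style estimates is, roughly, the step the paper's simDIRECT takes for the black-box case.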
Pages: 947-978
Page count: 32