On the choice of the low-dimensional domain for global optimization via random embeddings

Cited: 33
Authors
Binois, Mickael [1 ]
Ginsbourger, David [2 ,3 ]
Roustant, Olivier [4 ]
Affiliations
[1] Univ Chicago, Booth Sch Business, 5807 S Woodlawn Ave, Chicago, IL 60637 USA
[2] Idiap Res Inst, Uncertainty Quantificat & Optimal Design Grp, Ctr Parc,Rue Marconi 19,POB 592, CH-1920 Martigny, Switzerland
[3] Univ Bern, Dept Math & Stat, IMSV, Alpeneggstr 22, CH-3012 Bern, Switzerland
[4] Ecole Mines St Etienne, UMR CNRS 6158, LIMOS, 158 Cours Fauriel, F-42023 St Etienne, France
Funding
U.S. National Science Foundation
Keywords
Expensive black-box optimization; Low effective dimensionality; Zonotope; REMBO; Bayesian optimization; Computer experiments; Algorithm; Design
DOI
10.1007/s10898-019-00839-1
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research]
Subject classification codes
070105; 12; 1201; 1202; 120202
Abstract
The challenge of taking many variables into account in optimization problems may be overcome under the hypothesis of low effective dimensionality. The search for solutions can then be restricted to a random embedding of a low-dimensional space into the original one, resulting in a more manageable optimization problem. Specifically, in the case of time-consuming black-box functions and when the budget of evaluations is severely limited, global optimization with random embeddings appears as a sound alternative to random search. Yet, in the case of box constraints on the native variables, defining suitable bounds on the low-dimensional domain turns out to be complex. Indeed, a small search domain does not guarantee that a solution will be found, even under restrictive hypotheses about the function, while a larger one may slow down convergence dramatically. Here we tackle the issue of low-dimensional domain selection based on a detailed study of the properties of the random embedding, giving insight into the aforementioned difficulties. In particular, we describe a minimal low-dimensional set in correspondence with the embedded search space. We additionally show that an alternative, equivalent embedding procedure yields both a simpler definition of the low-dimensional minimal set and better properties in practice. Finally, the performance and robustness gains of the proposed enhancements for Bayesian optimization are illustrated on numerical examples.
Pages: 69-90
Number of pages: 22
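
The abstract describes the random-embedding idea at a high level. Below is a minimal, illustrative sketch (not the authors' implementation) of the classical REMBO-style construction it builds on: a random Gaussian matrix maps a low-dimensional point into the native space, the result is projected back onto the box constraints, and the original black box is evaluated there. The function names, the native box [-1, 1]^D, the coordinate-wise clipping used as the projection, and the low-dimensional bounds [-sqrt(d), sqrt(d)]^d are assumptions chosen for illustration; the article itself studies how the low-dimensional domain should be chosen and proposes an alternative embedding, which this sketch does not reproduce.

    import numpy as np

    def random_embedding_objective(f, D, d, seed=None):
        # Wrap a D-dimensional black box f, defined on the box [-1, 1]^D, into a
        # d-dimensional objective via a random linear embedding (REMBO-style sketch).
        rng = np.random.default_rng(seed)
        A = rng.standard_normal((D, d))          # random embedding matrix (D x d)

        def g(y):
            # Map the low-dimensional point into the native space and project the
            # result back onto the box by coordinate-wise clipping.
            x = np.clip(A @ np.asarray(y), -1.0, 1.0)
            return f(x)

        return g

    if __name__ == "__main__":
        # Toy black box with only two effective dimensions out of D = 25.
        D, d = 25, 2
        f = lambda x: float(np.sum((x[:2] - 0.3) ** 2))
        g = random_embedding_objective(f, D, d, seed=0)

        # Crude random search over a low-dimensional box [-sqrt(d), sqrt(d)]^d,
        # standing in for a Bayesian optimizer with a limited evaluation budget.
        bound = np.sqrt(d)
        ys = np.random.default_rng(1).uniform(-bound, bound, size=(200, d))
        y_best = min(ys, key=g)
        print("best low-dimensional point:", y_best, "value:", g(y_best))

Coordinate-wise clipping coincides with the convex projection here only because the box is axis-aligned; how large the low-dimensional search box must be for such a mapping to reach a solution, and how an alternative embedding simplifies that minimal set, is precisely the question the article addresses.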