A dimensionality reduction technique for unconstrained global optimization of functions with low effective dimensionality

Times cited: 6
Authors
Cartis, Coralia [1 ,2 ]
Otemissov, Adilet [1 ,2 ]
Affiliations
[1] Univ Oxford, Math Inst, Woodstock Rd, Oxford OX2 6GG, England
[2] British Lib, Alan Turing Inst, London NW1 2DB, England
Keywords
global optimization; random matrix theory; dimensionality reduction techniques; functions with low effective dimensionality; parameter
DOI
10.1093/imaiai/iaab011
CLC number
O29 [Applied Mathematics]
Subject classification number
070104
Abstract
We investigate the unconstrained global optimization of functions with low effective dimensionality, i.e. functions that are constant along certain (unknown) linear subspaces. Extending the random subspace embedding technique of Wang et al. (2016, J. Artificial Intelligence Res., 55, 361-387), we study a generic Random Embeddings for Global Optimization (REGO) framework that is compatible with any global minimization algorithm. Within REGO, instead of the original, potentially large-scale optimization problem, a Gaussian random, low-dimensional problem with bound constraints is formulated and solved in a reduced space. We provide novel probabilistic bounds for the success of REGO in solving the original, low effective-dimensionality problem; these bounds are independent of the (potentially large) ambient dimension and depend precisely on the dimensions of the effective and embedding subspaces. These results significantly improve existing theoretical analyses, both by providing the exact distribution of a reduced minimizer and its Euclidean norm and by requiring only general assumptions on the problem. We validate our theoretical findings through extensive numerical testing of REGO with three types of global optimization solvers, illustrating the improved scalability of REGO compared with the full-dimensional application of the respective solvers.
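The abstract's REGO framework can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: plain random search stands in for "any global minimization algorithm" in the reduced space, and the toy objective is an assumed example of a function with low effective dimensionality (it depends only on the first coordinate of the ambient variable).

```python
import numpy as np

def rego_minimize(f, D, d, n_samples=20000, box=5.0, seed=0):
    """Sketch of the REGO idea: draw a D x d Gaussian random embedding A,
    then globally minimize the reduced problem g(y) = f(A y) over the
    bound-constrained box [-box, box]^d.  The reduced solver here is plain
    random search, a placeholder for any global optimization algorithm."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((D, d))                 # Gaussian random embedding
    Y = rng.uniform(-box, box, size=(n_samples, d)) # candidate reduced points
    vals = np.array([f(A @ y) for y in Y])          # evaluate f in ambient space
    i = int(vals.argmin())
    return A @ Y[i], vals[i]                        # ambient minimizer estimate, value

# Toy objective with effective dimensionality 1 in ambient dimension D = 100:
# f is constant along the 99-dimensional subspace orthogonal to e_1.
D = 100
f = lambda x: (x[0] - 1.0) ** 2
x_best, f_best = rego_minimize(f, D=D, d=2)
```

Note that the reduced problem is only d-dimensional regardless of D, which is the source of the scalability the abstract reports: the solver's cost grows with the embedding dimension d, not the ambient dimension.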
Pages: 167-201 (35 pages)