Improving the Flexibility and Robustness of Model-based Derivative-free Optimization Solvers

Cited by: 91
Authors
Cartis, Coralia [1 ,3 ]
Fiala, Jan [2 ]
Marteau, Benjamin [2 ]
Roberts, Lindon [1 ,3 ]
Affiliations
[1] University of Oxford, Mathematical Institute, Oxford, England
[2] Numerical Algorithms Group, Wilkinson House, Jordan Hill Rd., Oxford OX2 8DR, England
[3] University of Oxford, Mathematical Institute, Radcliffe Observatory Quarter, Woodstock Rd., Oxford OX2 6GG, England
Source
ACM Transactions on Mathematical Software | 2019, Vol. 45, No. 3
Funding
Engineering and Physical Sciences Research Council (EPSRC), UK;
Keywords
Derivative-free optimization; least-squares; trust region methods; stochastic optimization; mathematical software; performance evaluation; algorithms; geometry;
DOI
10.1145/3338517
Chinese Library Classification (CLC)
TP31 [Computer Software];
Subject Classification Codes
081202; 0835;
Abstract
We present two software packages for derivative-free optimization (DFO): DFO-LS for nonlinear least-squares problems and Py-BOBYQA for general objectives, both with optional bound constraints. Inspired by the Gauss-Newton method, DFO-LS constructs simplified linear regression models for the residuals and allows flexible initialization for expensive problems, whereby it can begin making progress after as few as two objective evaluations. Numerical results show DFO-LS can make reasonable progress on some medium-scale problems with fewer objective evaluations than are needed for one gradient evaluation. DFO-LS has improved robustness to noise, allowing sample averaging, regression-based model construction, and multiple restart strategies with an auto-detection mechanism. Our extensive numerical experimentation shows that restarting the solver when stagnation is detected is a cheap and effective mechanism for achieving robustness, superior in performance to sampling and regression techniques. The package Py-BOBYQA is a Python implementation of BOBYQA (Powell 2009), with novel features such as the implementation of robustness-to-noise strategies. Our numerical experiments show that Py-BOBYQA is comparable to or better than existing general DFO solvers for noisy problems. In our comparisons, we introduce an adaptive accuracy measure for data profiles of noisy functions, striking a balance between measuring the true and the noisy objective improvement.
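Illustrative usage (a minimal sketch, not taken from the paper, assuming the documented Python interfaces of the released dfols and pybobyqa packages; the Rosenbrock residual function and the bounds are hypothetical example inputs):

    import numpy as np
    import dfols      # DFO-LS: derivative-free nonlinear least-squares solver
    import pybobyqa   # Py-BOBYQA: derivative-free solver for general objectives

    # Residual vector of the 2D Rosenbrock test problem (illustrative objective).
    def rosenbrock_residuals(x):
        return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

    x0 = np.array([-1.2, 1.0])

    # DFO-LS minimizes ||r(x)||^2 given only the residual vector r(x);
    # objfun_has_noise=True selects defaults robust to noisy objectives
    # (sample averaging / regression models / restarts, per the abstract).
    soln_ls = dfols.solve(rosenbrock_residuals, x0, objfun_has_noise=True)
    print(soln_ls)

    # Py-BOBYQA minimizes a scalar objective, here with optional bound constraints.
    def rosenbrock(x):
        return float(np.sum(rosenbrock_residuals(x) ** 2))

    bounds = (np.array([-5.0, -5.0]), np.array([5.0, 5.0]))
    soln_gen = pybobyqa.solve(rosenbrock, x0, bounds=bounds)
    print(soln_gen)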
Pages: 41
References (38 items)
[1] [Anonymous]. arXiv preprint arXiv:1403.1931.
[2] [Anonymous]. arXiv preprint arXiv:1703.04156.
[3] Powell M. J. D. The BOBYQA algorithm for bound constrained optimization without derivatives. Technical Report DAMTP 2009/NA06, University of Cambridge, 2009.
[4] Audet C., Hare W. Derivative-Free and Blackbox Optimization. Springer, 2017.
[5] Beiranvand V., Hare W., Lucet Y. Best practices for comparing optimization algorithms. Optimization and Engineering, 2017, 18(4): 815-848.
[6] Billups S. C., Larson J., Graf P. Derivative-free optimization of expensive functions with computational error using weighted regression. SIAM Journal on Optimization, 2013, 23(1): 27-53.
[7] Cartis C. Technical report, 2018.
[8] Cartis C., Roberts L. A derivative-free Gauss-Newton method. Mathematical Programming Computation, 2019.
[9] Chen R., Menickelly M., Scheinberg K. Stochastic optimization using a trust-region method and random models. Mathematical Programming, 2018, 169(2): 447-487.
[10] Conn A. R., Gould N. I. M., Toint P. L. Trust-Region Methods. SIAM, 2000. DOI: 10.1137/1.9780898719857.