Decomposition in derivative-free optimization

Cited by: 1
Authors
Ma, Kaiwen [1]
Sahinidis, Nikolaos V. [2,3]
Rajagopalan, Sreekanth [4]
Amaran, Satyajith [4]
Bury, Scott J. [5]
Affiliations
[1] Carnegie Mellon Univ, Dept Chem Engn, Pittsburgh, PA USA
[2] Georgia Inst Technol, H Milton Stewart Sch Ind & Syst Engn, Atlanta, GA 30332 USA
[3] Georgia Inst Technol, Sch Chem & Biomol Engn, Atlanta, GA 30332 USA
[4] Dow Chem Co USA, Lake Jackson, TX USA
[5] Dow Chem Co USA, Midland, TX USA
Keywords
Derivative-free optimization; Superiorization; SNOBFIT; ADAPTIVE DIRECT SEARCH; PARALLEL SPACE DECOMPOSITION; PATTERN SEARCH; PROJECTION METHODS; DERIVATIVE-FREE ALGORITHMS; CONVERGENCE; SUPERIORIZATION; MINIMIZATION; FEASIBILITY; SOFTWARE
DOI
10.1007/s10898-021-01051-w
CLC classification
C93 [Management]; O22 [Operations Research]
Discipline codes
070105; 12; 1201; 1202; 120202
Abstract
This paper proposes a novel decomposition framework for derivative-free optimization (DFO) algorithms. Our framework significantly extends the scope of current DFO solvers to larger-scale problems. We show that the proposed framework closely relates to the superiorization methodology that is traditionally used for improving the efficiency of feasibility-seeking algorithms for constrained optimization problems in a derivative-based setting. We analyze the convergence behavior of the framework in the context of global search algorithms. A practical implementation is developed and exemplified with the global model-based solver Stable Noisy Optimization by Branch and Fit (SNOBFIT) [36]. To investigate the decomposition framework's performance, we conduct extensive computational studies on a collection of over 300 test problems of varying dimensions and complexity. We observe significant improvements in the quality of solutions for a large fraction of the test problems. Regardless of problem convexity and smoothness, decomposition leads to over 50% improvement in the objective function after 2500 function evaluations for over 90% of our test problems with more than 75 variables.
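The abstract does not spell out the algorithm, but the basic idea of decomposition in derivative-free optimization (cyclically optimizing small blocks of variables with a derivative-free subsolver while holding the remaining variables fixed) can be sketched as follows. This is a minimal illustration, not the authors' method: the equal-size block partition, the random-search subsolver standing in for SNOBFIT, and the names dfo_subsolve, decomposed_dfo, n_blocks, and budget_per_block are all assumptions of this sketch.

import numpy as np

def dfo_subsolve(f, x, block, budget, step, rng):
    """Minimal derivative-free random search over the coordinates in
    `block`, holding all other coordinates of x fixed. A stand-in for
    a real DFO subsolver such as SNOBFIT, not the authors' algorithm."""
    best, fbest = x.copy(), f(x)
    for _ in range(budget):
        trial = best.copy()
        # Perturb only the coordinates belonging to the current block.
        trial[block] += step * rng.standard_normal(block.size)
        ftrial = f(trial)
        if ftrial < fbest:  # keep only improving points
            best, fbest = trial, ftrial
    return best, fbest

def decomposed_dfo(f, x0, n_blocks=4, cycles=10, budget_per_block=50,
                   step=0.5, seed=0):
    """Cyclic block decomposition: partition the variables into
    n_blocks groups and optimize each group in turn with the
    low-dimensional subsolver."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    blocks = np.array_split(np.arange(x.size), n_blocks)
    fx = f(x)
    for _ in range(cycles):
        for block in blocks:
            x, fx = dfo_subsolve(f, x, block, budget_per_block, step, rng)
    return x, fx

if __name__ == "__main__":
    # Smooth convex test function in 100 variables; minimum at x = 1.
    f = lambda x: float(np.sum((x - 1.0) ** 2))
    x, fx = decomposed_dfo(f, np.zeros(100))
    print(f"objective after decomposition: {fx:.6f}")

With the defaults above (4 blocks, 50 evaluations per block, 10 cycles, about 2000 function evaluations), the sketch stays near the 2500-evaluation budget reported in the experiments; the paper's framework additionally relates the decomposition to superiorization, which this toy loop does not model.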
Pages: 269-292
Page count: 24
References (56 items)
[1] Abramson, Mark A.; Audet, Charles. Convergence of mesh adaptive direct search to second-order stationary points. SIAM Journal on Optimization, 2006, 17(2): 606-619.
[2] Audet, Charles; Dennis, J. E., Jr. Mesh adaptive direct search algorithms for constrained optimization. SIAM Journal on Optimization, 2006, 17(1): 188-217.
[3] Audet, Charles; Dennis, J. E., Jr.; Le Digabel, Sébastien. Parallel space decomposition of the mesh adaptive direct search algorithm. SIAM Journal on Optimization, 2008, 19(3): 1150-1170.
[4] Bertsekas, Dimitri P.; Tsitsiklis, John N. Parallel and Distributed Computation: Numerical Methods. Prentice-Hall, 1989.
[6] Butnariu, Dan; Davidi, Ran; Herman, Gabor T.; Kazantsev, Ivan G. Stable convergence behavior under summable perturbations of a class of projection methods for convex feasibility and optimization problems. IEEE Journal of Selected Topics in Signal Processing, 2007, 1(4): 540-547.
[7] Censor, Yair. arXiv preprint, 2019.
[8] Censor, Yair; Heaton, Howard; Schulte, Reinhard. Derivative-free superiorization with component-wise perturbations. Numerical Algorithms, 2019, 80(4): 1219-1240.
[9] Censor, Yair. Weak and strong superiorization: between feasibility-seeking and minimization. Analele Stiintifice ale Universitatii Ovidius Constanta, Seria Matematica, 2015, 23(3): 41-54.
[10] Censor, Yair; Zaslavski, Alexander J. Strict Fejér monotonicity by superiorization of feasibility-seeking projection methods. Journal of Optimization Theory and Applications, 2015, 165(1): 172-187.