A Competitive Divide-and-Conquer Algorithm for Unconstrained Large-Scale Black-Box Optimization

Cited by: 209
Authors
Mei, Yi [1 ,3 ]
Omidvar, Mohammad Nabi [1 ]
Li, Xiaodong [1 ]
Yao, Xin [2 ]
Affiliations
[1] RMIT Univ, Sch Comp Sci & Informat Technol, Melbourne, Vic 3000, Australia
[2] Univ Birmingham, CERCIA, Sch Comp Sci, Birmingham B15 2TT, W Midlands, England
[3] Victoria Univ Wellington, Sch Engn & Comp Sci, Kelburn 6012, New Zealand
Source
ACM TRANSACTIONS ON MATHEMATICAL SOFTWARE | 2016, Vol. 42, Issue 2
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
Large-scale black-box optimization; decomposition; cooperative co-evolution; differential grouping; covariance matrix adaptation evolution strategy (CMA-ES); COOPERATIVE COEVOLUTION; SIMPLEX-METHOD; CONVERGENCE; DESCENT;
DOI
10.1145/2791291
Chinese Library Classification
TP31 [Computer Software];
Subject Classification Code
081202; 0835;
Abstract
This article proposes a competitive divide-and-conquer algorithm for solving large-scale black-box optimization problems, which have thousands of decision variables and no available algebraic model of the objective function. We focus on problems that are partially additively separable, since problems of this type can be further decomposed into a number of smaller independent subproblems. The proposed algorithm addresses two important issues in large-scale black-box optimization: (1) identifying the independent subproblems without explicit knowledge of the objective function's formula and (2) optimizing the identified black-box subproblems. First, a Global Differential Grouping (GDG) method is proposed to identify the independent subproblems. Then, a variant of the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) is adopted to solve the subproblems, owing to its rotation invariance property. GDG and CMA-ES work together under the cooperative co-evolution framework. The resulting algorithm, named CC-GDG-CMAES, is evaluated on the CEC'2010 large-scale global optimization (LSGO) benchmark functions, which have one thousand decision variables and black-box objective functions. The experimental results show that, on most of the test functions evaluated in this study, GDG obtains an ideal partition of the index set of the decision variables, and CC-GDG-CMAES outperforms the state-of-the-art results. Moreover, the competitive performance of the well-known CMA-ES is extended from low-dimensional to high-dimensional black-box problems.
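The decomposition stage described in the abstract hinges on detecting variable interactions purely from objective-function evaluations. The following is a minimal Python sketch of a differential-grouping style pairwise interaction test and a greedy grouping loop built on it; it is an illustration under simplifying assumptions rather than the paper's GDG implementation (GDG additionally builds a complete interaction matrix and estimates a global threshold for the comparison). The names interacts and group_variables, the perturbation size delta, and the tolerance eps are hypothetical choices made for this sketch.

    import numpy as np

    def interacts(f, x, i, j, delta=10.0, eps=1e-3):
        # Differential-grouping style pairwise test: i and j are flagged as
        # interacting if the change in f caused by perturbing x[i] itself
        # changes when x[j] is moved. For an additively separable pair the
        # two deltas coincide (up to numerical error).
        x1 = x.copy(); x1[i] += delta      # perturb x_i at the base point
        d1 = f(x1) - f(x)                  # effect of x_i with x_j at its base value
        x2 = x.copy(); x2[j] += delta      # shift x_j
        x3 = x2.copy(); x3[i] += delta     # perturb x_i at the shifted x_j
        d2 = f(x3) - f(x2)                 # effect of x_i with x_j shifted
        return abs(d1 - d2) > eps

    def group_variables(f, dim, lower, upper, eps=1e-3):
        # Greedy grouping of the index set {0, ..., dim-1} into non-separable
        # groups; a simplified stand-in for the decomposition stage, not the
        # paper's full GDG procedure.
        x = np.full(dim, lower, dtype=float)
        delta = (upper - lower) / 2.0
        remaining = list(range(dim))
        groups = []
        while remaining:
            i = remaining.pop(0)
            group = [i]
            for j in remaining[:]:         # iterate over a copy so removal is safe
                if interacts(f, x, i, j, delta, eps):
                    group.append(j)
                    remaining.remove(j)
            groups.append(group)
        return groups

    if __name__ == "__main__":
        # f(x) = (x0^2 + x0*x1) + x2^2 + x3^2 is partially additively separable:
        # x0 and x1 interact, while x2 and x3 form singleton subproblems.
        f = lambda x: x[0]**2 + x[0]*x[1] + x[2]**2 + x[3]**2
        print(group_variables(f, 4, lower=-5.0, upper=5.0))   # [[0, 1], [2], [3]]

On the toy function in the example, the sketch recovers the partition {x0, x1}, {x2}, {x3}, mirroring the "ideal partition of the index set" that the abstract attributes to GDG on most of the benchmark functions; each group would then be handed to a separate CMA-ES instance under the cooperative co-evolution framework.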
Pages: 1-24
Number of pages: 24