Surrogate-Assisted Differential Evolution With Adaptive Multisubspace Search for Large-Scale Expensive Optimization

Cited by: 30
Authors
Gu, Haoran [1 ,2 ]
Wang, Handing [1 ,2 ]
Jin, Yaochu [3 ,4 ]
Affiliations
[1] Xidian Univ, Sch Artificial Intelligence, Xian 710071, Peoples R China
[2] Xidian Univ, Collaborat Innovat Ctr Quantum Informat Shaanxi Pr, Xian 710071, Peoples R China
[3] Bielefeld Univ, Fac Technol, Nat Inspired Comp & Engn, D-33619 Bielefeld, Germany
[4] Univ Surrey, Dept Comp Sci, Guildford GU2 7XH, England
Funding
National Natural Science Foundation of China;
Keywords
Adaptive search switching strategy; large-scale expensive optimization; multisubspace search; radial basis function network (RBFN); surrogate; PARTICLE SWARM OPTIMIZATION; COOPERATIVE COEVOLUTION; ALGORITHM; APPROXIMATION; DESIGN;
DOI
10.1109/TEVC.2022.3226837
CLC Number (Chinese Library Classification)
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Real-world industrial engineering optimization problems often have a large number of decision variables. Most existing large-scale evolutionary algorithms (EAs) need a large number of function evaluations to achieve high-quality solutions. However, the function evaluations can be computationally intensive for many of these problems, which makes large-scale expensive optimization particularly challenging. To address this challenge, surrogate-assisted EAs based on the divide-and-conquer strategy have been proposed and shown to be promising. Following this line of research, we propose a surrogate-assisted differential evolution algorithm with adaptive multisubspace search for large-scale expensive optimization that takes full advantage of the population and the surrogate mechanism. The proposed algorithm constructs multiple subspaces based on principal component analysis and random decision variable selection, and searches adaptively in the constructed subspaces with three search strategies. Experimental results on a set of large-scale expensive test problems demonstrate its superiority over three state-of-the-art algorithms on optimization problems with up to 1000 decision variables.
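To make the two subspace-construction ideas named in the abstract more concrete, the snippet below is a minimal, hypothetical sketch in Python: one subspace is spanned by the leading principal components of the current population, the other is a random subset of the original decision variables. It does not reproduce the authors' algorithm, its RBFN surrogate, or the three search strategies; all function names and parameter values are illustrative assumptions.

```python
# Illustrative sketch only: PCA-based and random-variable subspaces as described
# in the abstract. Not the authors' implementation; names/values are assumptions.
import numpy as np
from sklearn.decomposition import PCA


def pca_subspace(population, n_components=5):
    """Fit PCA on the current population; its leading components span a
    low-dimensional subspace in which candidates can be generated and then
    mapped back to the full decision space via inverse_transform."""
    pca = PCA(n_components=n_components)
    pca.fit(population)
    return pca


def random_variable_subspace(dim, subspace_size, rng):
    """Randomly select a subset of decision-variable indices; search then
    perturbs only these variables (divide-and-conquer style)."""
    return rng.choice(dim, size=subspace_size, replace=False)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim, pop_size = 1000, 50
    population = rng.uniform(-5.0, 5.0, size=(pop_size, dim))

    # Subspace 1: leading principal components of the population.
    pca = pca_subspace(population, n_components=5)
    z = pca.transform(population[:1])           # project a solution into the subspace
    z += rng.normal(scale=0.1, size=z.shape)    # perturb in the low-dimensional space
    candidate_pca = pca.inverse_transform(z)    # map back to the 1000-D decision space

    # Subspace 2: a random subset of the original decision variables.
    idx = random_variable_subspace(dim, subspace_size=20, rng=rng)
    candidate_rand = population[0].copy()
    candidate_rand[idx] += rng.normal(scale=0.1, size=idx.size)

    print(candidate_pca.shape, candidate_rand.shape)
```

In the paper's setting, candidates produced by such subspace searches would be pre-screened by a surrogate model (an RBFN, per the keywords) before any expensive real function evaluation is spent.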
Pages: 1765-1779
Page count: 15