Learning Adaptive Subspace Differential Evolution by Evolution Strategies

Cited by: 1
Authors
Zhang, Haotian [1 ]
Shi, Jialong [2 ]
Sun, Jianyong [2 ]
Xu, Zongben [2 ]
Affiliations
[1] Xi An Jiao Tong Univ, Frontier Inst Sci & Technol, Xian, Peoples R China
[2] Xi An Jiao Tong Univ, Sch Math & Stat, Xian, Peoples R China
Source
2024 6TH INTERNATIONAL CONFERENCE ON DATA-DRIVEN OPTIMIZATION OF COMPLEX SYSTEMS, DOCS 2024 | 2024
Funding
National Natural Science Foundation of China
Keywords
Differential evolution; Parameter control; Learning to optimize; OPTIMIZATION;
DOI
10.1109/DOCS63458.2024.10704542
Chinese Library Classification
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
Subspace methods are among the most widely used optimization techniques: instead of optimizing all elements of the decision variable, they optimize only a subset at each iteration. On objective functions with special structure, such as sparse objectives, subspace methods can achieve a faster convergence rate. Their core step is selecting the proper subspace (i.e., the elements to be optimized) at each iteration, a choice that is usually made by hand. Recently this idea has been borrowed in studies of evolutionary algorithms, but the subspace-selection mechanism remains handcrafted. In our view, the selection mechanism can instead be learned by optimizing related optimization problems. In this paper, we propose an adaptive subspace differential evolution algorithm in which the subspace is selected by a graph neural network (GNN). The GNN is trained with Natural Evolution Strategies. Experiments on the CEC 2018 test suite show that the proposed method performs significantly better than classic differential evolution and generalizes to problems of different dimensions.
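To make the core idea concrete, the following is a minimal sketch of subspace differential evolution on a separable test function. Each generation, mutation and crossover touch only a k-dimensional subspace of the decision vector. In this sketch the subspace is chosen uniformly at random, a handcrafted stand-in for the paper's learned GNN selector; the function names, parameter values, and random selection rule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sphere(x):
    """Separable test objective: sum of squares."""
    return float(np.sum(x ** 2))

def subspace_de(f, dim=10, pop_size=20, gens=200, k=3, F=0.5, CR=0.9, seed=0):
    """DE/rand/1/bin restricted, each generation, to a k-dimensional subspace.
    The random subspace choice below is a placeholder for a learned selector."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, size=(pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(gens):
        # Select the subspace to optimize this generation (random stand-in).
        sub = rng.choice(dim, size=k, replace=False)
        for i in range(pop_size):
            others = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(others, size=3, replace=False)]
            trial = pop[i].copy()
            # Binomial crossover applied only inside the chosen subspace.
            mask = rng.random(k) < CR
            mask[rng.integers(k)] = True  # guarantee at least one coordinate changes
            trial[sub[mask]] = a[sub[mask]] + F * (b[sub[mask]] - c[sub[mask]])
            ft = f(trial)
            if ft <= fit[i]:  # greedy selection
                pop[i], fit[i] = trial, ft
    best = int(np.argmin(fit))
    return pop[best], float(fit[best])

best_x, best_f = subspace_de(sphere)
```

Because the sphere function is separable, optimizing a few coordinates at a time still converges; the paper's contribution is replacing the random `sub` choice with a GNN trained by Natural Evolution Strategies.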
Pages: 85-95 (11 pages)