Offline Data-Driven Optimization at Scale: A Cooperative Coevolutionary Approach

Cited by: 2
Authors
Gong, Yue-Jiao [1 ]
Zhong, Yuan-Ting [1 ]
Huang, Hao-Gan [1 ]
Affiliations
[1] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Optimization; Statistics; Sociology; Convergence; Evolutionary computation; Search problems; Mathematical models; Data-driven evolutionary algorithm (DDEA); divide and conquer; large-scale optimization; surrogate model; ASSISTED EVOLUTIONARY ALGORITHM; METAHEURISTICS;
DOI
10.1109/TEVC.2023.3338693
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Data-driven evolutionary algorithms (DDEAs) have received increasing attention over the past decade, but most existing studies address relatively small-scale problems. For large-scale optimization problems (LSOPs), both the surrogate model and the evolutionary search must be specially designed to overcome the "curse of dimensionality," which remains an open challenge in this area. To address this limitation, we propose a novel cooperative coevolution-based DDEA (CC-DDEA). First, a hierarchical surrogate joint-learning model is designed to provide fitness approximations in both the global space and subdivided subspaces, so that it can guide the evolutionary population at different granularities. Optimization is then conducted at both the global level and the local subspace level in a cooperative coevolutionary manner. In the local-level search, we introduce a gradient-based operator that exploits the differentiability of the surrogate model to accelerate convergence within the subspaces. In addition, the whole framework is coupled with a progressive, dynamic space-division strategy, enabling a local-parallel-to-global unified search and facilitating final convergence. Experiments on problems with up to 1000 dimensions, together with comparisons against state-of-the-art DDEAs, demonstrate the effectiveness of the proposed algorithm.
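The abstract names several concrete ingredients: an offline data set, a differentiable surrogate model, cooperative coevolution over dimension subgroups, and a gradient-based operator for the local-level search. The minimal Python sketch below only illustrates how such ingredients can fit together in general; it is not the authors' CC-DDEA. The RBF surrogate, the fixed four-group split, the step sizes, and all function names are illustrative assumptions (the paper itself uses a hierarchical surrogate and a progressive, dynamic space-division strategy).

    # Minimal sketch (NOT the authors' CC-DDEA): offline data, a differentiable
    # surrogate, cooperative coevolution over dimension subgroups, and a gradient
    # step on the surrogate for the local-level search. All choices below are
    # illustrative assumptions, not taken from the paper.
    import numpy as np

    rng = np.random.default_rng(0)

    # Offline data: samples of an expensive black-box function (toy stand-in here).
    DIM = 20
    def true_objective(x):
        return np.sum((x - 0.3) ** 2)

    X = rng.uniform(-1.0, 1.0, size=(200, DIM))      # offline design points
    y = np.array([true_objective(x) for x in X])      # offline evaluations only

    # Differentiable surrogate: Gaussian-kernel RBF interpolation.
    GAMMA = 2.0
    def rbf_weights(X, y, lam=1e-6):
        d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        K = np.exp(-GAMMA * d2)
        return np.linalg.solve(K + lam * np.eye(len(X)), y)

    W = rbf_weights(X, y)

    def surrogate(x):
        k = np.exp(-GAMMA * np.sum((X - x) ** 2, axis=1))
        return k @ W

    def surrogate_grad(x):
        diff = x - X                                    # (n, DIM)
        k = np.exp(-GAMMA * np.sum(diff ** 2, axis=1))  # (n,)
        return (-2.0 * GAMMA) * ((k * W) @ diff)        # analytic RBF gradient

    # Cooperative coevolution: split the dimensions into subgroups and optimize
    # each subgroup on the surrogate while the rest of the context vector is fixed.
    groups = np.array_split(rng.permutation(DIM), 4)    # static split for brevity
    context = rng.uniform(-1.0, 1.0, size=DIM)          # best-so-far full solution

    for cycle in range(10):
        for idx in groups:
            x = context.copy()
            for _ in range(20):                          # gradient-based local operator
                g = surrogate_grad(x)
                x[idx] -= 0.05 * g[idx]                  # update only this subgroup
                x = np.clip(x, -1.0, 1.0)
            if surrogate(x) < surrogate(context):        # cooperate via shared context
                context = x

    print("surrogate value of final solution:", surrogate(context))
    print("true value (for reference only):  ", true_objective(context))

One design point the sketch makes explicit: the gradient step costs no additional expensive evaluations, because it queries only the cheap surrogate, whose analytic gradient is available precisely because the model is differentiable.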
Pages: 1809-1823
Number of pages: 15