SF-FWA: A Self-Adaptive Fast Fireworks Algorithm for effective large-scale optimization

Cited: 107
Authors
Chen, Maiyue [1 ,2 ]
Tan, Ying [1 ,2 ,3 ]
Affiliations
[1] Peking Univ, Sch Intelligence Sci & Technol, Beijing 100871, Peoples R China
[2] Peking Univ, Key Lab Machine Percept MOE, Beijing 100871, Peoples R China
[3] Peking Univ, Inst Artificial Intelligence, Beijing 100871, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Black-box optimization; Large-scale optimization; Fireworks algorithm; Continuous control; Reinforcement learning; COOPERATIVE COEVOLUTION; EVOLUTION STRATEGY; BFGS; CMA;
DOI
10.1016/j.swevo.2023.101314
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Computationally efficient algorithms for large-scale black-box optimization have become increasingly important in recent years due to the growing complexity of engineering and scientific problems. In this paper, a novel algorithm called the Self-adaptive Fast Fireworks Algorithm (SF-FWA) is proposed to effectively conduct large-scale black-box optimization. The main idea is to utilize a set of expressive and computationally efficient search distributions to cope with different function landscapes while tuning the hyperparameters of the search distributions in an online fashion. To achieve this, the Expressive Fast Explosion (EFE) mechanism is designed to achieve effective and efficient sampling, and the Inter-Fireworks Competitive Cooperation (IFCC) mechanism is designed to adapt hyperparameter distributions. This new optimization paradigm equips the population with the ability to automatically adjust to a rich set of function landscapes with linear computational complexity in terms of problem dimensionality. Experimental studies show that SF-FWA can not only exploit the separability of the problem efficiently but can also deal with rotational transformations of the coordinate system. The numerical results on the standard large-scale optimization benchmark suite indicate that SF-FWA outperforms current state-of-the-art large-scale optimization algorithms. The outstanding performance of SF-FWA in optimizing neural network controllers for reinforcement learning tasks demonstrates its great potential for application to a wider range of real-world problems.
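The abstract's claim of linear computational complexity in the problem dimensionality can be illustrated with a generic sketch. The code below is not the paper's EFE mechanism; it is a minimal, hypothetical fireworks-style explosion step that samples sparks around a firework using a per-dimension (diagonal) scale vector, so each spark costs O(d) to generate rather than the O(d^2) of a full covariance matrix.

```python
import numpy as np

def explosion_sample(center, scale, n_sparks, rng):
    """Sample sparks around a firework (illustrative sketch, not SF-FWA's EFE).

    Each coordinate is perturbed independently by Gaussian noise with its own
    scale, so the cost per spark is linear in the dimensionality d.
    """
    d = center.shape[0]
    # (n_sparks, d) standard-normal noise, scaled per coordinate.
    noise = rng.standard_normal((n_sparks, d)) * scale
    return center + noise

# Usage: 8 sparks around the origin of a 1000-dimensional search space.
rng = np.random.default_rng(0)
center = np.zeros(1000)
scale = np.full(1000, 0.1)  # hypothetical per-dimension explosion amplitude
sparks = explosion_sample(center, scale, 8, rng)
```

A full-covariance sampler (as in standard CMA-ES) would instead multiply the noise by a d-by-d matrix factor, which is what restricted search distributions of this diagonal form avoid at large d.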
Pages: 16