Analysing hyper-heuristics based on Neural Networks for the automatic design of population-based metaheuristics in continuous optimisation problems

Cited by: 1
Authors
Tapia-Avitia, Jose M. [1 ]
Cruz-Duarte, Jorge M. [1 ]
Amaya, Ivan [1 ]
Ortiz-Bayliss, Jose Carlos [1 ]
Terashima-Marin, Hugo [1 ]
Pillay, Nelishia [2 ]
Affiliations
[1] Tecnol Monterrey, Sch Engn & Sci, Av Eugenio Garza Sada 2501 Sur, Monterrey 64700, Nuevo Leon, Mexico
[2] Univ Pretoria, Dept Comp Sci, Lynnwood Rd, ZA-0083 Pretoria, South Africa
Keywords
Parameter tuning and algorithm configuration; Metaheuristics; Neural Networks; Performance measures; Hyper-heuristics; DECADE;
DOI
10.1016/j.swevo.2024.101616
CLC Classification: TP18 [Artificial Intelligence Theory]
Discipline Classification Codes: 081104; 0812; 0835; 1405
Abstract
When dealing with optimisation problems, Metaheuristics (MHs) quickly come to mind, and a quick literature review reveals a vast universe of them. Although the metaphors behind these MHs are invariably presented as 'unique' to justify their novelty, many of them merely recombine elements from existing techniques. Hence, instead of proposing yet another MH based on something found in nature, it is preferable to follow a standard model for automatic metaheuristic design built from simple heuristics. Several approaches have designed algorithms that explore combinations of such heuristics, achieving remarkable results compared to generic MHs. Following this idea, our work examines Neural Network (NN) architectures over several control variables to tailor MHs. Our results reveal an architecture that improves upon generic MHs at a rate of 91%, upon MHs produced by Random Search at 81%, and upon the current state-of-the-art NN model at 66%. We also observe a considerable performance gap among NN-based models with different architectures, which is worth investigating. Among the benefits of the proposed approach is that it reduces the dependence on human knowledge, moving towards the automatic generation of solving methods that learn from empirical data how to succeed in various continuous optimisation scenarios.
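As a rough illustration of the approach summarised above, the sketch below shows how an NN-based selection hyper-heuristic could compose a population-based metaheuristic from a pool of simple heuristics on a continuous problem. This is not the paper's implementation: the objective function, the operator pool, the state features, and the untrained network weights are all illustrative assumptions.

```python
# Minimal sketch (not the paper's method) of an NN-driven selection hyper-heuristic
# that assembles a population-based metaheuristic from simple search operators for
# a continuous optimisation problem. All names and features are assumptions.
import numpy as np

rng = np.random.default_rng(7)

def sphere(x):                              # toy continuous objective to minimise
    return np.sum(x**2, axis=-1)

# --- Simple heuristics (search operators) acting on a population --------------
def random_perturbation(pop, fit):
    return pop + rng.normal(0.0, 0.1, pop.shape)

def differential_step(pop, fit):
    a, b = pop[rng.permutation(len(pop))], pop[rng.permutation(len(pop))]
    return pop + 0.5 * (a - b)

def move_towards_best(pop, fit):
    best = pop[np.argmin(fit)]
    return pop + 0.3 * (best - pop)

HEURISTICS = [random_perturbation, differential_step, move_towards_best]

# --- Tiny feed-forward NN scoring heuristics from a search-state vector -------
W1 = rng.normal(size=(3, 8));               b1 = np.zeros(8)
W2 = rng.normal(size=(8, len(HEURISTICS))); b2 = np.zeros(len(HEURISTICS))
# Weights are random here; in a real hyper-heuristic they would be trained.

def select_heuristic(state):
    h = np.tanh(state @ W1 + b1)
    logits = h @ W2 + b2
    probs = np.exp(logits - logits.max()); probs /= probs.sum()
    return rng.choice(len(HEURISTICS), p=probs)

# --- Compose a metaheuristic by applying NN-selected heuristics ---------------
pop = rng.uniform(-5, 5, size=(30, 2))
fit = sphere(pop)
for step in range(100):
    state = np.array([step / 100, pop.std(), fit.min()])    # crude state features
    new_pop = np.clip(HEURISTICS[select_heuristic(state)](pop, fit), -5, 5)
    new_fit = sphere(new_pop)
    improved = new_fit < fit                                 # greedy replacement
    pop[improved], fit[improved] = new_pop[improved], new_fit[improved]

print("best fitness found:", fit.min())
```

In the study above, the network's architecture and its configuration over several control variables are the object of analysis; in this sketch the network merely shows where such a model plugs into the search loop, choosing which operator acts on the population at each iteration.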
Pages: 22