Exponential distribution optimizer (EDO): a novel math-inspired algorithm for global optimization and engineering problems

Cited by: 127
Authors
Abdel-Basset, Mohamed [1 ]
El-Shahat, Doaa [1 ]
Jameel, Mohammed [2 ]
Abouhawwash, Mohamed [3 ,4 ]
Affiliations
[1] Zagazig Univ, Fac Comp & Informat, Zagazig 44519, Ash Sharqia Gov, Egypt
[2] Sanaa Univ, Fac Sci, Dept Math, 13509, Sanaa, Yemen
[3] Mansoura Univ, Fac Sci, Dept Math, Mansoura 35516, Egypt
[4] Michigan State Univ, Dept Computat Math Sci & Engn CMSE, E Lansing, MI 48824 USA
Keywords
Swarm intelligence; Exponential distribution optimizer algorithm; Memoryless property; Stochastic; Engineering design problem; PARTICLE SWARM OPTIMIZATION; COOPERATIVE COEVOLUTIONARY ALGORITHM; META-HEURISTIC OPTIMIZATION; GENETIC ALGORITHM; DIFFERENTIAL EVOLUTION; COMPETITIVE ALGORITHM; DESIGN; HYBRID; COLONY; SYSTEM;
DOI
10.1007/s10462-023-10403-9
CLC number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Numerous optimization problems can be addressed using metaheuristics instead of deterministic and heuristic approaches. This study proposes a novel population-based metaheuristic algorithm called the Exponential Distribution Optimizer (EDO). The main inspiration for EDO is mathematical: the exponential probability distribution model. At the outset, we initialize a population of random solutions representing multiple exponential distribution models, where the positions in each solution represent exponential random variables. The proposed algorithm comprises two strategies, one for exploitation and one for exploration. In the exploitation stage, the algorithm uses three main concepts to update the current solutions: the memoryless property, a guiding solution, and the exponential variance among the exponential random variables. To simulate the memoryless property, we assume that the original population contains only the winners, i.e., solutions that obtain good fitness. We construct a separate matrix, called the memoryless matrix, that retains the newly generated solutions regardless of how their fitness compares with that of their corresponding winners in the original population. As a result, the memoryless matrix stores two types of solutions: winners and losers. In accordance with the memoryless property, the previous history of these solutions is disregarded and not memorized, since past failures are independent of and have no influence on the future. The losers can therefore contribute to generating new solutions in subsequent iterations. In the exploration phase, two solutions drawn from the exponential distributions in the original population are selected to update the new solution. Furthermore, EDO is evaluated on classical test functions; the Congress on Evolutionary Computation (CEC) 2014, CEC 2017, CEC 2020, and CEC 2022 benchmarks; and six engineering design problems. EDO is compared with the winners of CEC 2014, CEC 2017, and CEC 2020, namely L-SHADE, LSHADE-cnEpSin, and AGSK, respectively. EDO delivers competitive results and can serve as a robust tool for CEC competitions. Statistical analysis demonstrates the superiority of the proposed EDO at the 95% confidence level.
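To make the winner/loser bookkeeping described in the abstract concrete, the Python sketch below mimics only the data flow: a main population that keeps winners through greedy selection, and a separate memoryless matrix that stores every newly generated candidate regardless of its fitness. It is a minimal illustration, not the published EDO: the function name edo_bookkeeping_sketch, the Gaussian placeholder move, the use of the current best individual as the guiding solution, and all parameter defaults are assumptions made here for demonstration; the paper's actual exploitation and exploration update equations should be consulted for a faithful implementation.

```python
import numpy as np


def edo_bookkeeping_sketch(objective, dim, pop_size=30, iters=100,
                           lb=-100.0, ub=100.0, seed=0):
    """Illustrative sketch of the winner/loser bookkeeping described in the
    abstract: the main population keeps only winners (greedy selection),
    while a separate 'memoryless' matrix stores every newly generated
    candidate regardless of fitness. The paper's actual position-update
    equations are replaced by a placeholder Gaussian perturbation purely
    to show the data flow."""
    rng = np.random.default_rng(seed)

    # Main population: random initial solutions within the search bounds
    # (the paper initializes them to represent exponential distribution
    # models; plain uniform initialization is assumed here).
    winners = rng.uniform(lb, ub, size=(pop_size, dim))
    fitness = np.apply_along_axis(objective, 1, winners)

    # Memoryless matrix: starts as a copy of the population and later holds
    # the most recent candidates, winners and losers alike.
    memoryless = winners.copy()

    for _ in range(iters):
        # Guiding solution: assumed here to be the current best individual.
        guide = winners[np.argmin(fitness)]

        # Placeholder candidate generation (NOT the paper's update rule).
        candidates = memoryless + rng.normal(size=winners.shape) * (guide - winners)
        candidates = np.clip(candidates, lb, ub)
        cand_fit = np.apply_along_axis(objective, 1, candidates)

        # Memoryless property: keep every candidate, ignoring past failures.
        memoryless = candidates

        # Greedy selection: the main population stores only the winners.
        improved = cand_fit < fitness
        winners[improved] = candidates[improved]
        fitness[improved] = cand_fit[improved]

    best = np.argmin(fitness)
    return winners[best], fitness[best]


if __name__ == "__main__":
    # Example: minimize the sphere function in 10 dimensions.
    sphere = lambda x: float(np.sum(x * x))
    best_x, best_f = edo_bookkeeping_sketch(sphere, dim=10)
    print("best fitness:", best_f)
```

Replacing the placeholder move with the exploitation and exploration equations from the paper would turn this scaffold into a complete EDO implementation; the point here is only the dual bookkeeping of the winner population and the memoryless matrix.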
Pages: 9329-9400
Number of pages: 72