Gradient-based optimizer: A new metaheuristic optimization algorithm

Cited by: 643
Authors
Ahmadianfar, Iman [1 ]
Bozorg-Haddad, Omid [2 ]
Chu, Xuefeng [3 ]
Affiliations
[1] Behbahan Khatam Alanbia Univ Technol, Dept Civil Engn, Behbahan, Iran
[2] Univ Tehran, Coll Agr & Nat Resources, Fac Agr Engn & Technol, Dept Irrigat & Reclamat Engn, Tehran, Iran
[3] North Dakota State Univ, Dept Civil & Environm Engn, Dept 2470, Fargo, ND USA
Keywords
Optimization; Gradient-based method; Metaheuristic algorithm; Constrained optimization problem; Particle swarm optimization; Atom search optimization; Differential evolution; Design optimization; GSA
DOI
10.1016/j.ins.2020.06.037
Chinese Library Classification (CLC)
TP [Automation Technology and Computer Technology]
Discipline Code
0812
Abstract
In this study, a novel metaheuristic optimization algorithm, the gradient-based optimizer (GBO), is proposed. The GBO, inspired by the gradient-based Newton's method, uses two main operators, the gradient search rule (GSR) and the local escaping operator (LEO), together with a set of vectors to explore the search space. The GSR employs the gradient-based method to enhance the exploration tendency and accelerate convergence toward better positions in the search space, while the LEO enables the GBO to escape from local optima. The performance of the new algorithm was evaluated in two phases: 28 mathematical test functions were first used to assess various characteristics of the GBO, and six engineering problems were then optimized with it. In the first phase, the GBO was compared with five existing optimization algorithms and yielded very promising results owing to its enhanced exploration, exploitation, and convergence, and its effective avoidance of local optima. The second phase also demonstrated the superior performance of the GBO in solving complex real-world engineering problems. Source codes of the GBO algorithm are publicly available at http://imanahmadianfar.com/codes/. (c) 2020 Elsevier Inc. All rights reserved.
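The two-operator structure described in the abstract can be made concrete with a minimal population-based sketch: an exploitation step that pulls each candidate toward the current best solution (standing in for the Newton-inspired GSR) and an occasional random jump around the best (standing in for the LEO). The update rules below are simplified assumptions for illustration only, not the published GSR/LEO equations, and the names gbo_like_sketch, pop_size, and iters are hypothetical; the authors' actual source code is available at the URL above.

import numpy as np

# A minimal sketch of a GBO-style search loop, assuming simplified stand-in
# update rules; these are NOT the published GSR/LEO equations.

def sphere(x):
    # Toy objective used only for this demonstration.
    return float(np.sum(x ** 2))

def gbo_like_sketch(obj, dim=5, pop_size=30, iters=200, lb=-10.0, ub=10.0, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lb, ub, size=(pop_size, dim))   # initial set of vectors
    fit = np.array([obj(x) for x in pop])

    for _ in range(iters):
        best = pop[np.argmin(fit)].copy()
        for i in range(pop_size):
            r1, r2 = rng.choice(pop_size, size=2, replace=False)

            # GSR-like move (assumption): pull the candidate toward the
            # current best, perturbed by a difference of two random members,
            # a crude stand-in for the Newton-inspired difference quotient.
            trial = (pop[i]
                     + rng.random(dim) * (best - pop[i])
                     + 0.5 * rng.standard_normal(dim) * (pop[r1] - pop[r2]))

            # LEO-like move (assumption): occasionally re-sample around the
            # best solution to help the search escape local optima.
            if rng.random() < 0.2:
                trial = best + rng.random() * rng.standard_normal(dim) * (pop[r1] - pop[r2])

            trial = np.clip(trial, lb, ub)
            f_trial = obj(trial)
            if f_trial < fit[i]:              # greedy replacement
                pop[i], fit[i] = trial, f_trial

    k = int(np.argmin(fit))
    return pop[k], float(fit[k])

if __name__ == "__main__":
    x_best, f_best = gbo_like_sketch(sphere)
    print("best objective value:", f_best)

Running this sketch on the toy sphere function drives the best objective value toward zero; it is meant only to illustrate the exploitation/escaping split, not to reproduce the GBO results reported in the paper.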
Pages: 131-159
Number of pages: 29