Spiral bacterial foraging optimization method: Algorithm, evaluation and convergence analysis

Cited by: 7
Authors
Kasaiezadeh, Alireza [1 ]
Khajepour, Amir [1 ]
Waslander, Steven L. [1 ]
Affiliations
[1] Univ Waterloo, Dept Mech & Mechatron Engn, Waterloo, ON N2L 3G1, Canada
Keywords
global optimization; multi-modal functions; biologically-inspired algorithm; convergence; stochastic gradient system; DISTRIBUTED OPTIMIZATION; STOCHASTIC ALGORITHMS; LINE SEARCH; GRADIENT; PERTURBATIONS; COMPUTATION; CHEMOTAXIS; BIOMIMICRY;
DOI
10.1080/0305215X.2013.776550
CLC classification number
T [Industrial Technology];
Discipline classification code
08;
Abstract
A biologically-inspired algorithm called Spiral Bacterial Foraging Optimization (SBFO) is investigated in this article. SBFO, previously proposed by the same authors, is a multi-agent, gradient-based algorithm that minimizes both the main objective function (local cost) and the distance between each agent and a temporary central point (global cost). A random jump is applied normal to the line connecting each agent to the central point, which produces a vortex around that point. This random jump also helps the algorithm cope with premature convergence, a common weakness of swarm-based optimization methods. The most important advantages of the algorithm are as follows. First, it performs a stochastic search with deterministic convergence. Second, because gradient-based steps are employed, it converges faster than GA, DE, BFO and similar methods. Third, it can be implemented in parallel to decentralize large-scale computation. Fourth, it has only a small number of tunable parameters. Finally, SBFO offers strong convergence guarantees, which are rare among global optimization algorithms. A detailed convergence analysis of SBFO for continuously differentiable objective functions is also presented in this article.
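The update described in the abstract can be sketched as a single iteration step. This is a minimal illustrative reconstruction, not the authors' implementation: the parameter names (`alpha`, `beta`, `sigma`), the choice of the swarm mean as the temporary central point, and the Gaussian tangential jump are all assumptions made for the example.

```python
import numpy as np

def sbfo_step(agents, grad_f, alpha=0.1, beta=0.05, sigma=0.05, rng=None):
    """One hypothetical SBFO-style iteration over a (n_agents, dim) array.

    Each agent takes a gradient step on the local cost, is attracted to a
    temporary central point (here: the swarm mean), and receives a random
    jump normal to its line to the center, producing the vortex motion
    described in the abstract.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    center = agents.mean(axis=0)              # temporary central point (assumed: swarm mean)
    new_agents = np.empty_like(agents)
    for i, x in enumerate(agents):
        d = center - x                        # direction reducing the global cost
        step = -alpha * grad_f(x) + beta * d  # local-cost descent + attraction to center
        dist = np.linalg.norm(d)
        if dist > 1e-12:
            u = d / dist                      # unit vector along the agent-center line
            r = rng.standard_normal(x.shape)
            r -= r.dot(u) * u                 # project out the radial component
            step += sigma * r                 # random jump normal to the connecting line
        new_agents[i] = x + step
    return new_agents

if __name__ == "__main__":
    # Toy usage on the sphere function f(x) = ||x||^2, whose gradient is 2x.
    grad_f = lambda x: 2.0 * x
    rng = np.random.default_rng(0)
    agents = rng.uniform(-3.0, 3.0, size=(6, 2))
    for _ in range(60):
        agents = sbfo_step(agents, grad_f, rng=rng)
    # The swarm contracts toward the minimizer at the origin while the
    # tangential jumps keep it circling the temporary central point.
```

The tangential projection is what distinguishes this from plain perturbed gradient descent: the noise never pushes an agent directly toward or away from the central point, so exploration happens around it rather than along the contraction direction.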
Pages: 439-464
Page count: 26
References
34 in total
[1] [Anonymous], 1990, CLASSICS APPL MATH.
[2] [Anonymous], ADAPTATION NATURAL A.
[3] Arnold, Dirk V.; Salomon, Ralf. Evolutionary gradient search revisited [J]. IEEE Transactions on Evolutionary Computation, 2007, 11(4): 480-495.
[4] Bertsekas, D. P.; Tsitsiklis, J. N. Gradient convergence in gradient methods with errors [J]. SIAM Journal on Optimization, 2000, 10(3): 627-642.
[5] Borkar, V. S.; Meyn, S. P. The ODE method for convergence of stochastic approximation and reinforcement learning [J]. SIAM Journal on Control and Optimization, 2000, 38(2): 447-469.
[6] Boyd, S. Convex Optimization. 1st ed., 2004. DOI: 10.1017/CBO9780511804441.
[7] Bozorg-Haddad, Omid; Afshar, Abbas; Marino, Miguel A. Honey-bees mating optimization (HBMO) algorithm: A new heuristic approach for water resources optimization [J]. Water Resources Management, 2006, 20(5): 661-680.
[8] Bremermann, H. Chemotaxis and optimization [J]. Journal of the Franklin Institute - Engineering and Applied Mathematics, 1974, 297(5): 397-404.
[9] Chen, Xianshun; Ong, Yew-Soon; Lim, Meng-Hiot; Tan, Kay Chen. A Multi-Facet Survey on Memetic Computation [J]. IEEE Transactions on Evolutionary Computation, 2011, 15(5): 591-607.
[10] Das, Swagatam; Dasgupta, Sambarta; Biswas, Arijit; Abraham, Ajith; Konar, Amit. On Stability of the Chemotactic Dynamics in Bacterial-Foraging Optimization Algorithm [J]. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 2009, 39(3): 670-679.