Particle swarm optimization with an enhanced learning strategy and crossover operator

Cited by: 61
Authors
Molaei, Sajjad [1 ]
Moazen, Hadi [2 ]
Najjar-Ghabel, Samad [1 ]
Farzinvash, Leili [1 ]
Affiliations
[1] Univ Tabriz, Fac Elect & Comp Engn, Tabriz, Iran
[2] Sharif Univ Technol, Dept Comp Engn, Tehran, Iran
Keywords
Particle swarm optimization; Swarm intelligence; Optimization; Enhanced learning strategy; Parameter updating; Crossover operator; FEATURE-SELECTION; ALGORITHM; WEIGHT; SEARCH; COLONY
DOI
10.1016/j.knosys.2021.106768
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Particle Swarm Optimization (PSO) is a well-known swarm intelligence (SI) algorithm employed for solving various optimization problems. This algorithm suffers from premature convergence to local optima. Accordingly, a number of PSO variants have been proposed in the literature. These algorithms exploit different schemes to improve performance. In this paper, we propose a new variant of PSO with an enhanced Learning strategy and Crossover operator (PSOLC). This algorithm applies three strategies: altering the exemplar particles, updating the PSO parameters, and integrating PSO with the Genetic Algorithm (GA). In the proposed learning strategy, each particle is guided by the best positions (pbests) of all particles, which improves its search capability. Furthermore, the proposed parameter-updating scheme computes the self-cognition coefficient for each particle based on the quality of the pbests. Finally, the proposed crossover operator injects randomness into the particles to improve the global search ability. The proposed improvements in PSOLC increase its exploration capability at the early stages of the search process and its exploitation ability at the end. The results of applying PSOLC and other PSO variants to benchmark functions verify the superiority of the proposed algorithm in terms of accuracy and convergence speed. (C) 2021 Elsevier B.V. All rights reserved.
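For reference, the canonical PSO update that PSOLC builds on can be sketched as follows. This is a minimal, generic global-best PSO on a benchmark objective, not the paper's PSOLC variant: the exemplar construction from all pbests, the per-particle coefficient schedule, and the GA crossover step are specific to the paper and are not reproduced here. All names (`sphere`, `pso`, the parameter defaults) are illustrative.

```python
import random

def sphere(x):
    """Benchmark objective: sum of squares; global minimum 0 at the origin."""
    return sum(v * v for v in x)

def pso(objective, dim=5, swarm=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Canonical global-best PSO (a baseline sketch, not PSOLC itself)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    # Personal bests (pbests) and the global best among them.
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(swarm), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia + self-cognition (pbest) + social (gbest) terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

The premature convergence discussed in the abstract stems from every particle being pulled toward the same `gbest`; PSOLC's strategies (pbest-based exemplars, adaptive self-cognition coefficients, and crossover-injected randomness) are aimed at that term.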
Pages: 21
References
55 total
[1]   Adaptive acceleration coefficients for a new search diversification strategy in particle swarm optimization algorithms [J].
Ardizzon, Guido ;
Cavazzini, Giovanna ;
Pavesi, Giorgio .
INFORMATION SCIENCES, 2015, 299 :337-378
[2]  
Awad N.H., 2017, PROBLEM DEFINITIONS
[3]   A survey on optimization metaheuristics [J].
Boussaid, Ilhem ;
Lepagnot, Julien ;
Siarry, Patrick .
INFORMATION SCIENCES, 2013, 237 :82-117
[4]   FZ enabled Multi-objective PSO for multicasting in IoT based Wireless Sensor Networks [J].
Chaudhry, Rashmi ;
Tapaswi, Shashikala ;
Kumar, Neetesh .
INFORMATION SCIENCES, 2019, 498 :1-20
[5]   Chaotic dynamic weight particle swarm optimization for numerical function optimization [J].
Chen, Ke ;
Zhou, Fengyu ;
Liu, Aling .
KNOWLEDGE-BASED SYSTEMS, 2018, 139 :23-40
[6]   A hybrid particle swarm optimizer with sine cosine acceleration coefficients [J].
Chen, Ke ;
Zhou, Fengyu ;
Yin, Lei ;
Wang, Shuqian ;
Wang, Yugang ;
Wan, Fang .
INFORMATION SCIENCES, 2018, 422 :218-241
[7]   Dynamic multi-swarm differential learning particle swarm optimizer [J].
Chen, Yonggang ;
Li, Lixiang ;
Peng, Haipeng ;
Xiao, Jinghua ;
Wu, Qingtao .
SWARM AND EVOLUTIONARY COMPUTATION, 2018, 39 :209-221
[8]   Particle swarm optimizer with crossover operation [J].
Chen, Yonggang ;
Li, Lixiang ;
Xiao, Jinghua ;
Yang, Yixian ;
Liang, Jun ;
Li, Tao .
ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2018, 70 :159-169
[9]   Particle swarm optimizer with two differential mutation [J].
Chen, Yonggang ;
Li, Lixiang ;
Peng, Haipeng ;
Xiao, Jinghua ;
Yang, Yixian ;
Shi, Yuhui .
APPLIED SOFT COMPUTING, 2017, 61 :314-330
[10]
Goldberg D.E., 1989, Genetic Algorithms in Search, Optimization, and Machine Learning