Dynamic multi-swarm global particle swarm optimization

Cited by: 19
Authors
Xia, Xuewen [1 ,2 ]
Tang, Yichao [2 ]
Wei, Bo [2 ]
Zhang, Yinglong [1 ]
Gui, Ling [1 ]
Li, Xiong [2 ]
Affiliations
[1] Minnan Normal Univ, Coll Phys & Informat Engn, Zhangzhou, Peoples R China
[2] East China Jiaotong Univ, Sch Software, Nanchang, Jiangxi, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Particle swarm optimization; Dynamic multi-swarm strategy; Continuous optimization problems; ALGORITHM; PSO; TIME; ADAPTATION;
DOI
10.1007/s00607-019-00782-9
CLC Number
TP301 [Theory and Methods]
Discipline Code
081202
Abstract
To satisfy the distinct requirements of different evolutionary stages, a dynamic multi-swarm global particle swarm optimization (DMS-GPSO) is proposed in this paper. In DMS-GPSO, the evolutionary process is divided into an initial stage and a later stage. In the initial stage, the population is split into a global sub-swarm and multiple dynamic multi-swarm (DMS) sub-swarms. The global sub-swarm focuses on exploitation under the guidance of the best particle in the entire population, while the DMS sub-swarms devote more attention to exploration under the guidance of each neighborhood's best-so-far position. Moreover, a store operator and a reset operator applied to the global sub-swarm save computational resources and increase population diversity, respectively. In the later stage, elite particles stored in an archive are merged with the DMS sub-swarms into a single population that searches for optimal solutions, with the aim of enhancing exploitation. The effect of the newly introduced strategies is verified by extensive experiments. In addition, comparisons between DMS-GPSO and 9 peer algorithms on the CEC2013 and CEC2017 test suites demonstrate that DMS-GPSO effectively avoids premature convergence on multimodal problems and yields more favorable performance on complex problems.
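To give a concrete feel for the two-stage, multi-swarm structure described above, the following is a minimal sketch only. The function name dms_gpso_sketch, the sphere objective, the coefficient values, the stage_split ratio, and the regrouping scheme are assumptions for illustration; the store/reset operators and the elite archive from the paper are omitted, so this is not the authors' implementation.

```python
# Illustrative two-stage multi-swarm PSO sketch (hypothetical structure; the
# authors' exact operators, parameters, and schedules are defined in the paper).
import numpy as np

def sphere(x):
    # Simple test objective, assumed here for demonstration only.
    return float(np.sum(x ** 2))

def dms_gpso_sketch(f, dim=10, pop=40, n_sub=4, iters=500, stage_split=0.7,
                    bounds=(-100.0, 100.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (pop, dim))      # positions
    V = np.zeros((pop, dim))                 # velocities
    P = X.copy()                             # personal best positions
    Pf = np.array([f(x) for x in X])         # personal best fitness
    w, c1, c2 = 0.729, 1.494, 1.494          # common PSO coefficients (assumed)

    # The first block of particles plays the role of the "global" sub-swarm;
    # the remaining particles form dynamic sub-swarms that are reshuffled
    # every iteration (a simplification of the DMS regrouping schedule).
    g_size = pop // (n_sub + 1)
    for t in range(iters):
        gbest = P[np.argmin(Pf)]
        if t < stage_split * iters:
            # Initial stage: global sub-swarm exploits gbest, DMS sub-swarms
            # explore under the guidance of their own sub-swarm's best.
            groups = [np.arange(g_size)]
            rest = rng.permutation(np.arange(g_size, pop))
            groups += list(np.array_split(rest, n_sub))
        else:
            # Later stage: all particles merge into a single population.
            groups = [np.arange(pop)]
        for gi, idx in enumerate(groups):
            if t < stage_split * iters and gi > 0:
                guide = P[idx[np.argmin(Pf[idx])]]   # sub-swarm best
            else:
                guide = gbest                        # global best
            r1 = rng.random((len(idx), dim))
            r2 = rng.random((len(idx), dim))
            V[idx] = (w * V[idx] + c1 * r1 * (P[idx] - X[idx])
                      + c2 * r2 * (guide - X[idx]))
            X[idx] = np.clip(X[idx] + V[idx], lo, hi)
            fx = np.array([f(x) for x in X[idx]])
            better = fx < Pf[idx]
            P[idx[better]] = X[idx[better]]
            Pf[idx[better]] = fx[better]
    return P[np.argmin(Pf)], Pf.min()

best_x, best_f = dms_gpso_sketch(sphere)
print(best_f)
```

In this sketch the only difference between the two stages is the guidance source and the grouping: dynamic sub-swarms follow their own best position early on (exploration), and the merged population follows the global best later (exploitation), which mirrors the division of labor summarized in the abstract.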
Pages: 1587-1626
Page count: 40