Phasor particle swarm optimization: a simple and efficient variant of PSO

Cited by: 210
Authors
Ghasemi, Mojtaba [1 ]
Akbari, Ebrahim [2 ]
Rahimnejad, Abolfazl [3 ]
Razavi, Seyed Ehsan [4 ]
Ghavidel, Sahand [5 ]
Li, Li [5 ]
Affiliations
[1] Shiraz Univ Technol, Dept Elect & Elect Engn, Shiraz, Iran
[2] Univ Isfahan, Dept Elect Engn, Fac Engn, Esfahan, Iran
[3] Univ Guelph, Sch Engn, Guelph, ON, Canada
[4] Univ Birjand, Fac Elect & Comp Engn, Birjand, Iran
[5] Univ Technol Sydney, Fac Engn & Informat Technol, Sydney, NSW, Australia
Keywords
Author keywords: Particle swarm phasor optimization; Phase angle; Large-scale optimization
Keywords Plus: GLOBAL OPTIMIZATION; DIFFERENTIAL EVOLUTION; ALGORITHM; SCALE; VELOCITY; STRATEGY; COLONY; MEMORY
DOI
10.1007/s00500-018-3536-8
CLC number
TP18 [Theory of Artificial Intelligence]
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Particle swarm optimization (PSO) is a well-known, efficient population-based algorithm for global optimization whose performance depends on its control parameters. This paper presents a new PSO variant, named phasor particle swarm optimization (PPSO), which models the particle control parameters with a phase angle (theta), inspired by phasor theory in mathematics. This phase angle turns PSO into a self-adaptive, trigonometric, balanced, and nonparametric metaheuristic algorithm. The performance of PPSO is tested on real-parameter optimization problems, including unimodal and multimodal standard test functions and traditional benchmark functions. The results show that PPSO performs well and efficiently in real-parameter global optimization, especially on high-dimensional problems, compared with other improved PSO algorithms from the literature. The phasor model can also be used to extend other PSO variants and other algorithms. The source codes of the PPSO algorithms are publicly available at .
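To make the phase-angle idea concrete, the minimal Python sketch below shows how per-particle phase angles could stand in for PSO's fixed inertia and acceleration coefficients. The specific trigonometric weights (w_p, w_g) and the phase-update rule are illustrative assumptions, not the authors' exact PPSO formulas; those are defined in the paper (DOI 10.1007/s00500-018-3536-8) and its released source code.

# Illustrative phasor-style PSO on the sphere test function.
# The weights w_p, w_g and the phase-update rule are assumptions for
# illustration only; the exact PPSO update rules are given in the paper.
import numpy as np

def sphere(x):
    # Sum of squares: a standard unimodal benchmark.
    return float(np.sum(x ** 2))

def phasor_pso_sketch(obj, dim=10, n_particles=30, iters=200, lo=-10.0, hi=10.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_particles, dim))          # particle positions
    theta = rng.uniform(0.0, 2.0 * np.pi, n_particles)   # one phase angle per particle
    pbest = x.copy()
    pbest_val = np.array([obj(p) for p in pbest])
    gbest = pbest[np.argmin(pbest_val)].copy()           # global best position

    for _ in range(iters):
        for i in range(n_particles):
            # Assumed phase-angle-driven weights replacing fixed c1/c2 coefficients.
            w_p = np.abs(np.cos(theta[i])) ** (2.0 * np.sin(theta[i]))
            w_g = np.abs(np.sin(theta[i])) ** (2.0 * np.cos(theta[i]))
            v = w_p * (pbest[i] - x[i]) + w_g * (gbest - x[i])
            x[i] = np.clip(x[i] + v, lo, hi)              # keep positions in bounds
            # Assumed phase-update rule so the weights keep evolving over iterations.
            theta[i] = (theta[i] + np.abs(np.cos(theta[i]) + np.sin(theta[i])) * 2.0 * np.pi) % (2.0 * np.pi)

            f = obj(x[i])
            if f < pbest_val[i]:                          # update personal best
                pbest_val[i] = f
                pbest[i] = x[i].copy()
        gbest = pbest[np.argmin(pbest_val)].copy()        # update global best

    return gbest, float(np.min(pbest_val))

if __name__ == "__main__":
    best_x, best_f = phasor_pso_sketch(sphere)
    print("best objective found:", best_f)

Because the weights are derived from each particle's own phase angle rather than from user-set constants, the sketch needs no tuned c1, c2, or inertia weight, which is the self-adaptive, nonparametric behavior the abstract describes.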
Pages: 9701-9718
Number of pages: 18