Bare Bones Particle Swarm Optimization With Scale Matrix Adaptation

Cited by: 52
Authors
Campos, Mauro [1]
Krohling, Renato A. [2,3]
Enriquez, Ivan [1]
Affiliations
[1] Univ Fed Espirito Santo, Dept Stat, BR-29075910 Vitoria, ES, Brazil
[2] Univ Fed Espirito Santo, Dept Prod Engn, BR-29075910 Vitoria, ES, Brazil
[3] Univ Fed Espirito Santo, Grad Program Comp Sci, BR-29075910 Vitoria, ES, Brazil
Keywords
Multivariate t-distribution; scale matrix adaptation (SMA); scale mixtures of normal distributions; swarm algorithms; SELF-ADAPTATION; EVOLUTIONARY; ALGORITHM;
DOI
10.1109/TCYB.2013.2290223
CLC classification number
TP [automation technology; computer technology]
Discipline classification code
0812
Abstract
Bare bones particle swarm optimization (BBPSO) is a swarm algorithm that has shown potential for solving single-objective unconstrained optimization problems over continuous search spaces. However, it suffers from premature convergence, which means it may become trapped in a local optimum when solving multimodal problems. To address this drawback and improve the performance of BBPSO, we propose a variant of the algorithm, named BBPSO with scale matrix adaptation (SMA-BBPSO for short). In SMA-BBPSO, the position of a particle is selected from a multivariate t-distribution with a rule for adapting its scale matrix. We use the multivariate t-distribution in its hierarchical form, as a scale mixture of normal distributions. The t-distribution has heavier tails than the normal distribution, which increases the ability of the particles to escape from a local optimum. In addition, our approach includes the normal distribution as a particular case. As a consequence, the t-distribution can be applied during the optimization process while maintaining a proper balance between exploration and exploitation. We also propose a simple update rule to adapt the scale matrix associated with a particle. Our strategy consists of adapting the scale matrix of a particle such that the best position found by any particle in its neighborhood is sampled with maximum likelihood in the next iteration. A theoretical analysis is developed to explain how SMA-BBPSO works, and an empirical study is carried out to evaluate the performance of the proposed algorithm. The experimental results show the suitability of the proposed approach in terms of its effectiveness at finding good solutions for all benchmark problems investigated. Nonparametric statistical tests indicate that SMA-BBPSO achieves a statistically significant improvement over other swarm algorithms.
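The abstract describes the mechanism but not the exact equations, so the Python sketch below is only an illustration under stated assumptions: candidate positions are drawn from a multivariate t-distribution generated in its scale-mixture-of-normals form, and each particle's scale matrix is pulled toward the outer product of its displacement from the neighborhood best so that this best position becomes more likely at the next draw. The global-best topology, the fixed degrees of freedom, the mixing weight lam, the sphere objective, and all function names are assumptions for illustration, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    # Simple test objective (not from the paper); global minimum at the origin.
    return float(np.sum(x ** 2))

def sample_multivariate_t(mean, scale, dof, rng):
    # Draw from a multivariate t-distribution via its scale-mixture-of-normals
    # form: x = mean + z / sqrt(w), with z ~ N(0, scale) and w ~ Gamma(dof/2, rate=dof/2).
    d = mean.size
    w = rng.gamma(shape=dof / 2.0, scale=2.0 / dof)
    z = rng.multivariate_normal(np.zeros(d), scale)
    return mean + z / np.sqrt(w)

def sma_bbpso(f, dim=10, n_particles=20, iters=200, dof=3.0, lam=0.5):
    # Initialize particles, personal bests, and per-particle scale matrices.
    X = rng.uniform(-5.0, 5.0, size=(n_particles, dim))
    pbest = X.copy()
    pbest_val = np.array([f(x) for x in X])
    scales = np.stack([np.eye(dim) for _ in range(n_particles)])
    g = pbest[np.argmin(pbest_val)].copy()  # global best (used as the "neighborhood best")

    for _ in range(iters):
        for i in range(n_particles):
            # Sample a candidate position from a t-distribution centred at the
            # neighborhood best with this particle's scale matrix.
            x_new = sample_multivariate_t(g, scales[i], dof, rng)
            val = f(x_new)
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = x_new, val

            # Illustrative rank-one adaptation (an assumption, not the paper's rule):
            # blend the scale matrix toward the outer product of (g - x_new), so the
            # direction of the current best gains likelihood in the next sampling step.
            d_vec = (g - x_new).reshape(-1, 1)
            scales[i] = (1.0 - lam) * scales[i] + lam * (d_vec @ d_vec.T)
            scales[i] += 1e-8 * np.eye(dim)  # keep the matrix positive definite

        g = pbest[np.argmin(pbest_val)].copy()

    return g, float(np.min(pbest_val))

if __name__ == "__main__":
    best_x, best_val = sma_bbpso(sphere)
    print("best value found:", best_val)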
Pages: 1567-1578
Page count: 12