A swarm optimizer with attention-based particle sampling and learning for large scale optimization

Cited by: 3
Authors
Sheng M. [1 ,4 ]
Wang Z. [2 ,3 ]
Liu W. [3 ]
Wang X. [1 ]
Chen S. [5 ]
Liu X. [3 ]
Affiliations
[1] School of Computer Science and Technology, Zhejiang University of Technology, Hangzhou
[2] College of Electrical Engineering and Automation, Shandong University of Science and Technology, Qingdao
[3] Department of Computer Science, Brunel University London, Middlesex, Uxbridge
[4] Zhejiang Police College, Hangzhou
[5] School of Computer Science and Technology, Tianjin University of Technology, Tianjin
Keywords
Attention mechanism; Exemplar selection; Large scale optimization; Particle swarm optimization
DOI
10.1007/s12652-022-04432-5
Abstract
The attention mechanism, a cognitive process of selectively concentrating on certain information while ignoring the rest, has been successfully employed in deep learning. In this paper, we introduce the attention mechanism into a particle swarm optimizer and propose an attention-based particle swarm optimizer (APSO) for large-scale optimization. In the proposed method, the attention mechanism activates different particles to participate in the search at different stages of evolution. Further, an attention-based particle learning strategy is devised that randomly selects three particles from the predominant sub-swarm activated by the attention mechanism to guide the learning of each particle. The cooperation of these two strategies achieves a balanced evolutionary search, allowing the space of large-scale optimization problems to be explored appropriately. Extensive experiments have been carried out on the CEC'2010 and CEC'2013 large-scale optimization benchmark functions to evaluate the performance of the proposed method and compare it with related methods. The results show the superiority of the proposed method. © 2022, The Author(s).
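The abstract describes two cooperating strategies: attention-driven activation of a predominant sub-swarm, and particle learning guided by three exemplars sampled from that sub-swarm. The sketch below is a minimal, hypothetical illustration of these two ideas, not the authors' exact APSO update rule; the function names, the stage-based activation schedule, and the velocity formula are assumptions for illustration only.

```python
import numpy as np

def activate_subswarm(fitness, progress, n_groups=4):
    """Rank particles by fitness and activate one fitness group (a "sub-swarm").

    Illustrative assumption: early in the run (small `progress`) attention falls
    on better-ranked groups to exploit promising regions; later it shifts toward
    worse-ranked groups to preserve diversity.
    """
    order = np.argsort(fitness)                      # best first (minimization)
    group_size = max(1, len(fitness) // n_groups)
    group_idx = min(int(progress * n_groups), n_groups - 1)
    return order[group_idx * group_size:(group_idx + 1) * group_size]

def attention_guided_step(pos, vel, fitness, progress, w=0.7, c=1.5, rng=None):
    """One update step: each particle learns from three exemplars drawn from
    the activated (predominant) sub-swarm."""
    rng = rng or np.random.default_rng()
    active = activate_subswarm(fitness, progress)
    for i in range(len(pos)):
        exemplars = pos[rng.choice(active, size=3, replace=True)]
        guide = exemplars.mean(axis=0)               # aggregate the three exemplars
        r = rng.random(pos.shape[1])
        vel[i] = w * vel[i] + c * r * (guide - pos[i])
        pos[i] = pos[i] + vel[i]
    return pos, vel
```

In this sketch the three exemplars are averaged into a single guide; how APSO actually combines the selected exemplars is detailed in the paper itself.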
Pages: 9329–9341
Page count: 12