A hybrid differential evolution based on gaining-sharing knowledge algorithm and harris hawks optimization

Cited by: 6
Authors
Zhong, Xuxu [1 ]
Duan, Meijun [2 ]
Zhang, Xiao [1 ]
Cheng, Peng [3 ]
Affiliations
[1] Sichuan Univ, Natl Key Lab Fundamental Sci Synthet Vis, Chengdu, Peoples R China
[2] Xihua Univ, Sch Comp & Software Engn, Chengdu, Peoples R China
[3] Sichuan Univ, Sch Aeronaut & Astronaut, Chengdu, Peoples R China
Source
PLOS ONE | 2021, Vol. 16, Issue 4
Funding
National Natural Science Foundation of China
关键词
LEARNING-BASED OPTIMIZATION; GLOBAL OPTIMIZATION; CONTROL PARAMETERS;
DOI
10.1371/journal.pone.0250951
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
Differential evolution (DE) is favored by scholars for its simplicity and efficiency, but its ability to balance exploration and exploitation needs to be enhanced. In this paper, a hybrid differential evolution with the gaining-sharing knowledge algorithm (GSK) and Harris hawks optimization (HHO) is proposed, abbreviated as DEGH. Its main contributions are as follows. First, a hybrid mutation operator is constructed in DEGH, in which the two-phase strategy of GSK, the classical "rand/1" mutation operator of DE, and the soft besiege rule of HHO are used and improved, forming a double-insurance mechanism for balancing exploration and exploitation. Second, a novel crossover probability self-adaptation strategy is proposed to strengthen the internal relation among the mutation, crossover, and selection operations of DE. On this basis, the crossover probability and scaling factor jointly affect the evolution of each individual, enabling the proposed algorithm to better adapt to various optimization problems. In addition, DEGH is compared with eight state-of-the-art DE algorithms on 32 benchmark functions. Experimental results show that DEGH is significantly superior to the compared algorithms.
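For readers unfamiliar with the baseline components the abstract refers to, the sketch below illustrates classical DE with "rand/1" mutation, binomial crossover governed by the crossover probability CR, and greedy selection. It is a minimal sketch of standard DE only, not of the DEGH hybrid (the GSK two-phase strategy, the HHO soft besiege rule, and the self-adaptation scheme are not reproduced here); the test function `sphere` and the parameter values F = 0.5 and CR = 0.9 are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of the classical DE components named in the abstract:
# "rand/1" mutation, binomial crossover with probability CR, and greedy
# selection. Not the DEGH hybrid; sphere() and all parameter values are
# illustrative assumptions.

def sphere(x):
    # Simple test objective: f(x) = sum(x_i^2), minimized at the origin.
    return np.sum(x ** 2)

def de_rand_1_bin(fobj, dim=10, pop_size=30, F=0.5, CR=0.9,
                  max_gen=200, bounds=(-100.0, 100.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fitness = np.array([fobj(x) for x in pop])
    for _ in range(max_gen):
        for i in range(pop_size):
            # "rand/1" mutation: v = x_r1 + F * (x_r2 - x_r3), with r1, r2, r3
            # distinct and different from i.
            r1, r2, r3 = rng.choice(
                [j for j in range(pop_size) if j != i], 3, replace=False)
            v = np.clip(pop[r1] + F * (pop[r2] - pop[r3]), lo, hi)
            # Binomial crossover: take each gene from the mutant with
            # probability CR; j_rand guarantees at least one mutant gene.
            j_rand = rng.integers(dim)
            mask = rng.random(dim) < CR
            mask[j_rand] = True
            u = np.where(mask, v, pop[i])
            # Greedy selection: keep the trial vector only if it is no worse.
            fu = fobj(u)
            if fu <= fitness[i]:
                pop[i], fitness[i] = u, fu
    best = np.argmin(fitness)
    return pop[best], fitness[best]

best_x, best_f = de_rand_1_bin(sphere)
print(best_f)
```

In this baseline, F and CR are fixed; the abstract's point is that DEGH instead adapts the crossover probability and couples it with the scaling factor so that both jointly steer each individual's evolution.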
Pages: 24