Learning-infused optimization for evolutionary computation

Times Cited: 1
Authors
Bian, Kun [1 ]
Zhang, Juntao [2 ]
Han, Hong [1 ]
Zhou, Jun [2 ]
Sun, Yifei [3 ]
Cheng, Shi [4 ]
Affiliations
[1] Xidian Univ, Sch Elect Engn, Xian 710071, Peoples R China
[2] AMS, Inst Syst Engn, Beijing 100141, Peoples R China
[3] Shaanxi Normal Univ, Sch Phys & Informat Technol, Xian 710119, Peoples R China
[4] Shaanxi Normal Univ, Sch Comp Sci, Xian 710119, Peoples R China
Keywords
Evolutionary computation; Deep learning; Learning-infused optimization; Synthesis patterns; DIFFERENTIAL EVOLUTION; ADAPTATION; ALGORITHM;
DOI
10.1016/j.swevo.2025.101930
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Evolutionary computation is a class of meta-heuristic algorithms that mimic the process of biological evolution, using information exchange among individuals in a population to iteratively search for optimal solutions. The evolutionary process generates a substantial amount of data, from which valuable evolutionary information can be extracted to steer the algorithm in a more effective direction. Neural networks, in turn, excel at extracting knowledge from data. Motivated by this, we propose a learning-infused optimization (LIO) framework that employs neural networks to learn the evolutionary processes of the algorithms and to extract synthesis patterns from the valuable evolutionary information. These synthesis patterns generalize well: they guide the algorithm towards better solutions on the original problems and enable a transfer-evolution ability that improves performance on new problems. The LIO framework is applied to various algorithms. Experimental results demonstrate that synthesis patterns extracted from the CEC14 problems not only guide the evolution of the algorithms towards better solutions on the original problems, but also significantly improve the performance of the algorithms on the CEC17 problems.
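The abstract's idea can be illustrated with a minimal, hypothetical sketch (not the authors' actual method): run a simple mutation-based evolutionary loop on a source problem, record the (parent, successful step) pairs it generates, fit a model to those pairs as a stand-in for the paper's neural-network "synthesis patterns", and then reuse the learned model to bias mutation on a new problem. All function names, the toy problems, and the linear least-squares model below are illustrative assumptions; the paper itself uses neural networks and the CEC14/CEC17 benchmark suites.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 5

def sphere(x):          # toy source problem (optimum at the origin)
    return np.sum(x ** 2)

def shifted_sphere(x):  # toy "new" problem used to illustrate transfer
    return np.sum((x - 1.0) ** 2)

def evolve(f, steps=200, pop=20, sigma=0.3, model=None):
    """Gaussian-mutation EA; optionally biased by a learned pattern model."""
    X = rng.normal(0.0, 2.0, (pop, DIM))
    pairs = []  # (parent, successful step) pairs harvested during the run
    for _ in range(steps):
        fit = np.array([f(x) for x in X])
        parents = X[np.argsort(fit)[: pop // 2]]
        children = []
        for p in parents:
            step = rng.normal(0.0, sigma, DIM)
            if model is not None:
                # Bias the random mutation with the learned pattern.
                step = step + model @ np.append(p, 1.0)
            c = p + step
            if f(c) < f(p):          # record successful moves only
                pairs.append((p, c - p))
            children.append(c if f(c) < f(p) else p)
        X = np.vstack([parents, np.array(children)])
    best = min(X, key=f)
    return best, pairs

# Phase 1: learn a pattern from the evolutionary data of the source problem.
_, pairs = evolve(sphere)
P = np.array([np.append(p, 1.0) for p, _ in pairs])  # parents (+ bias term)
D = np.array([d for _, d in pairs])                  # successful steps
model = np.linalg.lstsq(P, D, rcond=None)[0].T       # least-squares "pattern"

# Phase 2: reuse the learned pattern to bias search on the new problem.
best_plain, _ = evolve(shifted_sphere, model=None)
best_lio, _ = evolve(shifted_sphere, model=model)
print("plain:", shifted_sphere(best_plain), "with pattern:", shifted_sphere(best_lio))
```

The least-squares fit here is only a placeholder for the neural network in the paper; the point is the data flow: evolutionary runs produce training pairs, a learned model distills them into a reusable pattern, and that pattern then steers mutation on other problems.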
Pages: 13