A two-stage accelerated search strategy for large-scale multi-objective evolutionary algorithm

Cited by: 0
Authors
Cui, Zhihua [1 ]
Wu, Yijing [1 ]
Zhao, Tianhao [1 ]
Zhang, Wensheng [2 ]
Chen, Jinjun [3 ]
Affiliations
[1] Taiyuan Univ Sci & Technol, Shanxi Key Lab Big Data Anal & Parallel Comp, Taiyuan 030024, Shanxi, Peoples R China
[2] Chinese Acad Sci, Inst Automat, Beijing, Peoples R China
[3] Swinburne Univ Technol, Dept Comp Technol, Melbourne, Vic, Australia
Funding
National Natural Science Foundation of China;
Keywords
Evolutionary algorithm; Large-scale multi-objective optimization; Opposite learning; Artificial neural network; Accelerated search; OPTIMIZATION ALGORITHM;
DOI
10.1016/j.ins.2024.121347
Chinese Library Classification (CLC)
TP [Automation technology, computer technology];
Discipline code
0812;
Abstract
Since large-scale multi-objective problems (LSMOPs) involve a huge number of decision variables, traditional evolutionary algorithms face low exploitation efficiency and high exploration costs when solving them. This paper therefore proposes an evolutionary strategy based on a two-stage accelerated search optimizer (ATAES). In the first stage, a convergence optimizer is devised: a three-layer lightweight convolutional neural network is built, and the population is divided into two subsets, a diversity subset and a convergence subset, which serve as the input nodes and the expected output nodes of the network, respectively. By repeatedly backpropagating the gradient, the network produces promising individuals. Once exploitation stagnation is detected in the first stage, the second stage is triggered, in which a diversity optimizer based on a differential evolution algorithm with opposite (opposition-based) learning widens the exploration range of candidate solutions and thereby improves population diversity. Finally, to validate the algorithm's performance, ATAES is compared with other state-of-the-art multi-objective evolutionary algorithms on the LSMOP and DTLZ benchmark suites with 100, 300, 500, and 1000 decision variables, and demonstrates its superiority.
Pages: 23
Related papers
50 records in total
  • [31] A multi-stage knowledge-guided evolutionary algorithm for large-scale sparse multi-objective optimization problems
    Ding, Zhuanlian
    Chen, Lei
    Sun, Dengdi
    Zhang, Xingyi
    SWARM AND EVOLUTIONARY COMPUTATION, 2022, 73
  • [32] A flexible two-stage constrained multi-objective evolutionary algorithm based on automatic regulation
    Zou, Juan
    Luo, Jian
    Liu, Yuan
    Yang, Shengxiang
    Zheng, Jinhua
    INFORMATION SCIENCES, 2023, 634 : 227 - 243
  • [33] Two-stage evolutionary algorithm with fuzzy preference indicator for multimodal multi-objective optimization
    Xie, Yinghong
    Li, Junhua
    Li, Yufei
    Zhu, Wenhao
    Dai, Chaoqing
    SWARM AND EVOLUTIONARY COMPUTATION, 2024, 85
  • [34] A two-stage preference driven multi-objective evolutionary algorithm for workflow scheduling in the Cloud
    Xie, Huamao
    Ding, Ding
    Zhao, Lihong
    Kang, Kaixuan
    Liu, Qiaofeng
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 238
  • [35] A two-stage evolutionary algorithm based on three indicators for constrained multi-objective optimization
    Dong, Jun
    Gong, Wenyin
    Ming, Fei
    Wang, Ling
    EXPERT SYSTEMS WITH APPLICATIONS, 2022, 195
  • [36] Two-stage sparse multi-objective evolutionary algorithm for channel selection optimization in BCIs
    Liu, Tianyu
    Wu, Yu
    Ye, An
    Cao, Lei
    Cao, Yongnian
    FRONTIERS IN HUMAN NEUROSCIENCE, 2024, 18
  • [37] An inverse model-guided two-stage evolutionary algorithm for multi-objective optimization
    Shen, Jiangtao
    Dong, Huachao
    Wang, Peng
    Li, Jinglu
    Wang, Wenxin
    EXPERT SYSTEMS WITH APPLICATIONS, 2023, 225
  • [38] A new two-stage based evolutionary algorithm for solving multi-objective optimization problems
    Wang, Yiming
    Gao, Weifeng
    Gong, Maoguo
    Li, Hong
    Xie, Jin
    INFORMATION SCIENCES, 2022, 611 : 649 - 659
  • [39] Competition-based two-stage evolutionary algorithm for constrained multi-objective optimization
    Hao, Lupeng
    Peng, Weihang
    Liu, Junhua
    Zhang, Wei
    Li, Yuan
    Qin, Kaixuan
    MATHEMATICS AND COMPUTERS IN SIMULATION, 2025, 230 : 207 - 226
  • [40] A two-stage evolutionary algorithm for large-scale sparse multiobjective optimization problems
    Jiang, Jing
    Han, Fei
    Wang, Jie
    Ling, Qinghua
    Han, Henry
    Wang, Yue
    SWARM AND EVOLUTIONARY COMPUTATION, 2022, 72