μMOSM: A hybrid multi-objective micro evolutionary algorithm

Times Cited: 8
Authors
Abdi, Yousef [1 ]
Asadpour, Mohammad [1 ]
Seyfari, Yousef [2 ]
Affiliations
[1] Univ Tabriz, Fac Elect & Comp Engn, POB 51666-16471, Tabriz, Iran
[2] Univ Maragheh, Fac Engn, POB 83111-55181, Maragheh, Iran
Keywords
Multi-objective optimization; Micro algorithm; Multi-operator; Hybrid evolutionary algorithm; Particle swarm optimization; Diversity; Selection; MOPSO
DOI
10.1016/j.engappai.2023.107000
CLC Number
TP [Automation and computer technology]
Discipline Code
0812
Abstract
In multi-objective optimization problems (MOPs), several mutually conflicting objectives are optimized simultaneously. In such scenarios, there is no unique solution to the problem; instead, there is a set of solutions known as the Pareto front, representing the trade-offs between objectives. Multi-objective evolutionary algorithms (MOEAs) can approximate these solutions in a single run. However, due to their resource-intensive nature, MOEAs are not well suited to real-time and engineering MOPs, such as the optimization of manufacturing processes and of energy consumption in wireless networks, where fast convergence at low computational cost is required. Micro versions of MOEAs can meet this requirement by using a tiny population size, but the small population can lose diversity rapidly, and the algorithm may easily fall into a local optimum. While approaches such as the restart technique have been proposed to address this issue, hybrid techniques (integrative, collaborative, and decomposition-based methods) have not been effectively considered in the design of micro algorithms, even though hybridization is a widely accepted way to enhance the diversity of evolutionary algorithms. In this study, we propose a hybrid micro MOEA called μMOSM that effectively tackles the diversity-loss problem and accelerates convergence in approximating Pareto-front solutions. Experimental results on benchmark test suites and a real-world MOP demonstrate the advantages of the proposed algorithm and confirm that μMOSM outperforms state-of-the-art MOEAs and micro MOEAs, including MOSM, ADE-MOIA, MMOPSO, NSGA-III, MOEA/D-FRRMAB, μFAME, and ASMiGA.
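The abstract's core ingredients of a micro MOEA (a tiny population, an external set of non-dominated solutions approximating the Pareto front, and a restart to recover lost diversity) can be illustrated with a minimal sketch. This is not the authors' μMOSM: the toy test problem, Gaussian mutation operator, archive size, and restart trigger below are assumptions chosen purely for demonstration.

```python
import random

# Illustrative sketch only (not the authors' muMOSM): a micro evolutionary
# loop with a tiny population, a small external archive of non-dominated
# solutions, and a naive restart when the population loses diversity.
# The test problem, variation operator, and parameter values are assumptions.

def dominates(a, b):
    """Pareto dominance for minimization: a dominates b if it is no worse in
    every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def evaluate(x):
    """Toy bi-objective problem (Schaffer N.1-style) over one decision variable."""
    return (x[0] ** 2, (x[0] - 2.0) ** 2)

def update_archive(archive, solution, objectives, max_size=20):
    """Keep only mutually non-dominated solutions, truncating crudely if full."""
    if any(dominates(f, objectives) for _, f in archive):
        return archive
    archive = [(s, f) for s, f in archive if not dominates(objectives, f)]
    archive.append((solution, objectives))
    return archive[:max_size]  # real micro MOEAs use crowding/grid truncation

def micro_ea(pop_size=5, generations=200, bounds=(-4.0, 4.0), seed=1):
    random.seed(seed)
    pop = [[random.uniform(*bounds)] for _ in range(pop_size)]
    archive = []
    for _ in range(generations):
        for x in pop:
            archive = update_archive(archive, x, evaluate(x))
        # Variation: Gaussian mutation around each parent (illustrative operator).
        pop = [[min(max(x[0] + random.gauss(0.0, 0.3), bounds[0]), bounds[1])]
               for x in pop]
        # Restart: reseed the tiny population if it has collapsed to one point.
        if max(x[0] for x in pop) - min(x[0] for x in pop) < 1e-3:
            pop = [[random.uniform(*bounds)] for _ in range(pop_size)]
    return archive

if __name__ == "__main__":
    for _, f in sorted(micro_ea(), key=lambda entry: entry[1][0]):
        print(f)
```

A real micro MOEA such as the one proposed here would replace the naive truncation and restart trigger with the hybrid, multi-operator mechanisms the paper targets.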
Pages: 20