Success-History Based Parameter Adaptation in MOEA/D Algorithm

Cited by: 4
Authors
Akhmedova, Shakhnaz [1 ]
Stanovov, Vladimir [1 ]
Affiliations
[1] Reshetnev Siberian State Univ Sci & Technol, Krasnoyarskiy Rabochiy Av 31, Krasnoyarsk 660037, Russia
Source
ADVANCES IN SWARM INTELLIGENCE, ICSI 2020 | 2020, Vol. 12145
Keywords
Multi-objective optimization; Differential evolution; Parameter adaptation; Self-adaptation; MOEA/D
DOI
10.1007/978-3-030-53956-6_41
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, two parameter self-adaptation schemes are proposed for the MOEA/D-DE algorithm. These schemes use the fitness improvement ratio to change four parameter values for every individual separately, since in the MOEA/D framework every individual solves its own scalar optimization problem. The first proposed scheme samples new parameter values and replaces the old ones if an improvement is achieved, while the second keeps a set of memory cells and updates the parameter values using a weighted sum. The proposed methods are tested on two sets of benchmark problems, namely the MOEA/D-DE functions and the WFG functions, and the IGD and HV metrics are calculated. The results are compared using statistical tests. The comparison shows that the proposed parameter adaptation schemes are capable of delivering significant improvements to the performance of the MOEA/D-DE algorithm. It is also shown that parameter tuning is better than random sampling of parameter values. The proposed parameter self-adaptation techniques could be used for other multi-objective algorithms that use the MOEA/D framework.
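The second scheme described in the abstract (memory cells updated with a weighted sum of successful parameter values) follows the general success-history idea known from SHADE-type differential evolution. Below is a minimal Python sketch of such a memory-cell update; the class name, memory size, sampling distribution, and parameter ranges are illustrative assumptions, not the exact settings used in the paper.

```python
import numpy as np

class SuccessHistoryMemory:
    """Illustrative SHADE-style success-history memory for one control parameter.

    Each memory cell stores a mean value for a parameter (e.g. F or CR).
    After a generation, values that produced an improvement are combined
    with a weighted mean, weighted by the fitness improvement they gave.
    """

    def __init__(self, size=5, init=0.5):
        self.memory = np.full(size, init)  # H memory cells
        self.k = 0                         # index of the cell updated next

    def sample(self, rng):
        # Pick a random cell and sample a new parameter value around its mean.
        mean = self.memory[rng.integers(len(self.memory))]
        return float(np.clip(rng.normal(mean, 0.1), 0.0, 1.0))

    def update(self, successful_values, improvements):
        # Weighted update: values that gave a larger improvement count more.
        if len(successful_values) == 0:
            return
        w = np.asarray(improvements, dtype=float)
        w = w / w.sum()
        v = np.asarray(successful_values, dtype=float)
        # Weighted Lehmer mean, as commonly used for F in SHADE-type methods.
        self.memory[self.k] = np.sum(w * v**2) / np.sum(w * v)
        self.k = (self.k + 1) % len(self.memory)


# Usage sketch: sample one value per subproblem, record successes, update memory.
rng = np.random.default_rng(0)
mem_F = SuccessHistoryMemory()
F_values = [mem_F.sample(rng) for _ in range(10)]
mem_F.update(successful_values=[0.6, 0.7], improvements=[0.02, 0.05])
```

In a MOEA/D setting, one such memory (or one per parameter) would be kept and queried separately for each scalar subproblem, since each individual solves its own scalarized objective.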
Pages: 455-462
Page count: 8