BAS-ADAM: an ADAM based approach to improve the performance of beetle antennae search optimizer

Cited by: 176
Authors
Khan, Ameer Hamza [1 ]
Cao, Xinwei [2 ]
Li, Shuai [3 ]
Katsikis, Vasilios N. [4 ]
Liao, Liefa [5 ]
Affiliations
[1] Hong Kong Polytech Univ, Dept Comp, Hong Kong, Peoples R China
[2] Shanghai Univ, Sch Management, Shanghai 201900, Peoples R China
[3] Swansea Univ, Dept Elect & Elect Engn, Swansea SA1 8EN, W Glam, Wales
[4] Natl & Kapodistrian Univ Athens, Dept Econ, Div Math & Informat, Athens 10679, Greece
[5] Jiangxi Univ Sci & Technol, Sch Informat Engn, Ganzhou 341000, Peoples R China
Keywords
Adaptive moment estimation (ADAM); beetle antennae search (BAS); gradient estimation; metaheuristic optimization; nature-inspired algorithms; neural network; SINE COSINE ALGORITHM; SYSTEMS; CONTROLLER; MODEL
DOI
10.1109/JAS.2020.1003048
Chinese Library Classification (CLC)
TP [automation technology; computer technology]
Discipline classification code
0812
Abstract
In this paper, we propose an enhancement to the beetle antennae search (BAS) algorithm, called BAS-ADAM, to smooth its convergence behavior and avoid becoming trapped in local minima on highly non-convex objective functions. We achieve this by adaptively adjusting the step size in each iteration using the adaptive moment estimation (ADAM) update rule. The proposed algorithm also increases the convergence rate in narrow valleys. A key feature of the ADAM update rule is its ability to adjust the step size for each dimension separately, rather than applying the same step size to all dimensions. Since ADAM is traditionally used with gradient-based optimization algorithms, we first propose a gradient-estimation model that does not require differentiating the objective function. As a result, the algorithm demonstrates excellent performance and a fast convergence rate when searching for the optimum of non-convex functions. The efficiency of the proposed algorithm was tested on three different benchmark problems, including the training of a high-dimensional neural network. Its performance is compared with the particle swarm optimizer (PSO) and the original BAS algorithm.
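The two ingredients the abstract describes, a gradient estimate obtained from objective evaluations alone and a per-dimension ADAM step, combine naturally. The Python sketch below is a minimal illustration under stated assumptions, not the authors' reference implementation: the finite-difference gradient estimate along a random unit "antenna" direction and the geometric shrink factor `eta` for the antennae length are our assumptions about the method's structure; the moment updates follow the standard ADAM rule.

```python
import numpy as np

def bas_adam(f, x0, n_iter=500, d=1.0, alpha=0.1,
             beta1=0.9, beta2=0.999, eps=1e-8, eta=0.97, seed=0):
    """Minimal BAS-ADAM sketch (illustrative, not the paper's exact scheme).

    f     : objective R^n -> R, evaluated only (never differentiated)
    x0    : initial guess
    d     : initial antennae length, shrunk by `eta` each iteration (assumed schedule)
    alpha : ADAM step size; beta1/beta2/eps are the standard ADAM constants
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)                      # first-moment (mean) estimate
    v = np.zeros_like(x)                      # second-moment (variance) estimate
    best_x, best_f = x.copy(), f(x)
    for t in range(1, n_iter + 1):
        b = rng.standard_normal(x.shape)
        b /= np.linalg.norm(b) + eps          # random unit antenna direction
        # Gradient estimate from the two antennae: central difference along b
        g = b * (f(x + d * b) - f(x - d * b)) / (2.0 * d)
        # Standard ADAM moment updates with bias correction
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        m_hat = m / (1 - beta1**t)
        v_hat = v / (1 - beta2**t)
        # Per-dimension step: each coordinate is scaled by its own sqrt(v_hat)
        x = x - alpha * m_hat / (np.sqrt(v_hat) + eps)
        d = max(d * eta, 1e-4)                # shrink antennae length (assumed floor)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f
```

A quick smoke test on the 2-D Rosenbrock function, a classic narrow-valley benchmark of the kind the abstract mentions; exact results will vary with the random direction sequence and hyperparameters:

```python
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
x_best, f_best = bas_adam(rosen, np.array([-1.0, 1.0]), n_iter=2000)
```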
Pages: 461-471 (11 pages)