A multi-agent optimization algorithm and its application to training multilayer perceptron models

Cited by: 13
Authors
Chauhan, Dikshit [1 ]
Yadav, Anupam [1 ]
Neri, Ferrante [2 ]
Affiliations
[1] Dr BR Ambedkar Natl Inst Technol Jalandhar, Dept Math, Jalandhar 144008, India
[2] Univ Surrey, Dept Comp Sci, NICE Res Grp, Guildford, England
Keywords
Meta-heuristic algorithms; Optimization; Artificial intelligence; Neural network; Particle swarm optimization; Artificial neural networks; Differential evolution; Global optimization; Hybrid model; Backpropagation; PSO; Classification; Prediction; Parameters
DOI
10.1007/s12530-023-09518-9
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The optimal parameter values of a feed-forward neural network play an important role in determining the efficiency and accuracy of the trained model. In this paper, we propose an upgraded artificial electric field algorithm (AEFA) for training feed-forward neural network models. The paper also sheds light on the effective use of multi-agent meta-heuristic techniques for training neural network models and on their future prospects. Seven real-life data sets are used to train the models; the results show that the proposed scheme outperforms other training algorithms, including gradient-based algorithms and differential evolution variants, in terms of classification accuracy and test error. Some fundamental modifications to AEFA are also proposed to make it more suitable for training neural networks. All the experimental findings show that the search capabilities and convergence rate of the proposed scheme are better than those of other capable schemes, including gradient-based ones.
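The abstract describes training a feed-forward network with a population-based optimizer: each agent's position encodes a full set of network weights, and agents are attracted toward better solutions via Coulomb-like forces. The following is a minimal, hypothetical sketch of this idea (not the authors' upgraded implementation), using the charge and force rules of the original AEFA applied to a tiny MLP on the XOR problem; the network size, constants `K0` and `alpha`, and all helper names are illustrative assumptions.

```python
# Illustrative AEFA-style training of a tiny MLP (hypothetical sketch,
# not the paper's implementation). Each agent's position is a flattened
# weight vector; fitness is the MSE of the decoded MLP on XOR.
import numpy as np

rng = np.random.default_rng(0)

# Tiny dataset: XOR
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

H = 4                      # hidden units (illustrative choice)
DIM = 2 * H + H + H + 1    # W1 (2xH) + b1 (H) + W2 (H) + b2 (1)

def mlp_loss(w):
    """Decode a flat weight vector into a 2-H-1 MLP and return MSE."""
    W1 = w[:2 * H].reshape(2, H)
    b1 = w[2 * H:3 * H]
    W2 = w[3 * H:4 * H]
    b2 = w[4 * H]
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output
    return float(np.mean((out - y) ** 2))

POP, T = 30, 200            # population size, iterations (illustrative)
K0, alpha = 500.0, 30.0     # Coulomb-constant schedule parameters

pos = rng.uniform(-1, 1, (POP, DIM))
vel = np.zeros((POP, DIM))
pbest = pos.copy()
pfit = np.array([mlp_loss(p) for p in pos])
init_best = pfit.min()

for t in range(T):
    fit = np.array([mlp_loss(p) for p in pos])
    improved = fit < pfit
    pbest[improved] = pos[improved]
    pfit[improved] = fit[improved]

    # Charge: better (lower) fitness -> larger normalized charge.
    best, worst = fit.min(), fit.max()
    q = np.exp((worst - fit) / (worst - best + 1e-12))
    q /= q.sum()

    # Exponentially decaying Coulomb "constant".
    K = K0 * np.exp(-alpha * t / T)

    # Coulomb-like attraction toward other agents' best positions.
    acc = np.zeros_like(pos)
    for i in range(POP):
        for j in range(POP):
            if i == j:
                continue
            diff = pbest[j] - pos[i]
            R = np.linalg.norm(diff) + 1e-12
            acc[i] += rng.random() * K * q[j] * diff / R

    vel = rng.random((POP, DIM)) * vel + acc
    pos = pos + vel

best_w = pbest[pfit.argmin()]
```

Because personal bests only ever improve, the loss of `best_w` is guaranteed to be no worse than the best initial agent; on this toy problem the population typically drives the XOR error well below a constant predictor's 0.25, illustrating why the paper frames weight training as a multi-agent search problem.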
Pages: 849-879
Page count: 31