A new parallel galactic swarm optimization algorithm for training artificial neural networks

Cited by: 4
Authors
Bhardwaj, Shubham [1 ]
Amali, Geraldine Bessie D. [2 ]
Phadke, Amrut [2 ]
Umadevi, K. S. [2 ]
Balakrishnan, P. [2 ]
Affiliations
[1] Reliance Jio, Hyderabad, India
[2] Vellore Inst Technol, Sch Comp Sci & Engn, Vellore, Tamil Nadu, India
Keywords
nature inspired metaheuristic; parallel computation; galactic swarm optimization; artificial neural networks;
DOI
10.3233/JIFS-179747
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Metaheuristic algorithms are a family of algorithms that provide near-optimal solutions to NP-hard problems in a reasonable amount of time. Galactic Swarm Optimization (GSO) is a state-of-the-art metaheuristic inspired by the motion of stars and galaxies under the influence of gravity. In this paper, a new scalable algorithm, Parallel Galactic Swarm Optimization (PGSO), is proposed to overcome the inherently sequential nature of GSO and to let the modified algorithm utilize the full computing capacity of the hardware efficiently. The modified algorithm includes new features for training an Artificial Neural Network. The proposed algorithm is compared with Stochastic Gradient Descent in terms of performance and accuracy, and its performance was evaluated via per-CPU utilization on multiple platforms. Experimental results show that PGSO outperforms GSO and competitors such as PSO in a variety of challenging settings.
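The abstract describes training a neural network with a swarm optimizer organized at two levels (subswarms, then a global level). The following is a minimal stdlib sketch of that idea, not the authors' PGSO: it uses standard PSO updates (Clerc's constriction coefficients) inside independent subswarms ("galaxies") and then picks the best of their bests, training a tiny 2-2-1 tanh network on XOR. All names, constants, and the network shape are illustrative assumptions.

```python
import math
import random

random.seed(0)

# XOR dataset and a fixed 2-2-1 tanh network with 9 weights:
# 4 hidden weights, 2 hidden biases, 2 output weights, 1 output bias.
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
DIM = 9

def forward(w, x):
    h1 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[4])
    h2 = math.tanh(w[2] * x[0] + w[3] * x[1] + w[5])
    return math.tanh(w[6] * h1 + w[7] * h2 + w[8])

def loss(w):
    # Mean squared error over the four XOR patterns.
    return sum((forward(w, x) - y) ** 2 for x, y in XOR) / len(XOR)

def pso_subswarm(n=20, iters=200):
    """One subswarm ("galaxy"): plain PSO minimizing the network loss."""
    pos = [[random.uniform(-2, 2) for _ in range(DIM)] for _ in range(n)]
    vel = [[0.0] * DIM for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [loss(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(DIM):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.72 * vel[i][d]
                             + 1.49 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.49 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = loss(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# "Galactic" level: run several independent subswarms (these could run in
# parallel processes, which is the scalability point of the paper) and keep
# the best weight vector found by any of them.
bests = [pso_subswarm() for _ in range(3)]
w, f = min(bests, key=lambda b: b[1])
print(f"best loss: {f:.4f}")
```

Because the subswarms share no state until the final reduction, they are trivially parallelizable (e.g. one process per subswarm), which is the property the paper exploits; the true GSO additionally runs a second PSO over the subswarm bests rather than a plain minimum.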
Pages: 6691 - 6701 (11 pages)
Related papers (50 in total)
  • [21] Training feedforward neural networks using hybrid particle swarm optimization and gravitational search algorithm
    Mirjalili, SeyedAli
    Hashim, Siti Zaiton Mohd
    Sardroudi, Hossein Moradian
    APPLIED MATHEMATICS AND COMPUTATION, 2012, 218 (22) : 11125 - 11137
  • [22] An Improved Quantum-Behaved Particle Swarm Optimization Algorithm for Training Fuzzy Neural Networks
    Chiang, Cheng-Hsiung
    2013 INTERNATIONAL CONFERENCE ON FUZZY THEORY AND ITS APPLICATIONS (IFUZZY 2013), 2013, : 358 - 363
  • [23] An improved butterfly optimization algorithm for training the feed-forward artificial neural networks
    Irmak, Büşra
    Karakoyun, Murat
    Gülcü, Şaban
    SOFT COMPUTING, 2023, 27 (07) : 3887 - 3905
  • [24] MUSSELS WANDERING OPTIMIZATION ALGORITHM BASED TRAINING OF ARTIFICIAL NEURAL NETWORKS FOR PATTERN CLASSIFICATION
    Abusnaina, Ahmed A.
    Abdullah, Rosni
    COMPUTING & INFORMATICS, 4TH INTERNATIONAL CONFERENCE, 2013 : 78 - 85
  • [26] A New Particle Swarm Optimization Algorithm for Neural Network Optimization
    Ling, S. H.
    Nguyen, Hung T.
    Chan, K. Y.
    NSS: 2009 3RD INTERNATIONAL CONFERENCE ON NETWORK AND SYSTEM SECURITY, 2009, : 516 - +
  • [27] A parallel algorithm for gradient training of feedforward neural networks
    Hanzalek, Z
    PARALLEL COMPUTING, 1998, 24 (5-6) : 823 - 839
  • [28] Training neural networks using Multiobjective Particle Swarm Optimization
    Yusiong, John Paul T.
    Naval, Prospero C., Jr.
    ADVANCES IN NATURAL COMPUTATION, PT 1, 2006, 4221 : 879 - 888
  • [29] Parallel nonlinear optimization techniques for training neural networks
    Phua, PKH
    Ming, DH
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2003, 14 (06): : 1460 - 1468
  • [30] Modifications of the Givens Training Algorithm for Artificial Neural Networks
    Bilski, Jaroslaw
    Kowalczyk, Bartosz
    Cader, Andrzej
    ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING, PT I, 2019, 11508 : 14 - 28