A new parallel galactic swarm optimization algorithm for training artificial neural networks

Cited by: 4
Authors
Bhardwaj, Shubham [1 ]
Amali, Geraldine Bessie D. [2 ]
Phadke, Amrut [2 ]
Umadevi, K. S. [2 ]
Balakrishnan, P. [2 ]
Affiliations
[1] Reliance Jio, Hyderabad, India
[2] Vellore Inst Technol, Sch Comp Sci & Engn, Vellore, Tamil Nadu, India
Keywords
nature inspired metaheuristic; parallel computation; galactic swarm optimization; artificial neural networks
DOI
10.3233/JIFS-179747
CLC number: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
Metaheuristic algorithms are a family of algorithms that provide near-optimal solutions to NP-hard problems in a reasonable amount of time. Galactic Swarm Optimization (GSO) is a state-of-the-art metaheuristic algorithm inspired by the motion of stars and galaxies under the influence of gravity. In this paper, a new scalable algorithm, Parallel Galactic Swarm Optimization (PGSO), is proposed to overcome the inherently sequential nature of GSO and to allow the modified algorithm to utilize the full computing capacity of the hardware efficiently. The modified algorithm includes new features for tackling the problem of training an Artificial Neural Network. The proposed algorithm is compared with Stochastic Gradient Descent in terms of performance and accuracy, and its performance was evaluated based on per-CPU utilization on multiple platforms. Experimental results show that PGSO outperforms GSO and other competitors such as PSO in a variety of challenging settings.
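The two-level scheme the abstract alludes to (independent subswarm searches whose best solutions are then optimized again as a "superswarm") can be sketched as follows. This is a minimal illustration of a GSO-style optimizer based on the published description of GSO, not the paper's PGSO implementation; the objective function, parameter names, and values are illustrative assumptions.

```python
import numpy as np

def sphere(x):
    """Toy objective standing in for a neural network's training loss."""
    return float(np.sum(x * x))

def pso(positions, f, iters=50, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One standard PSO run over the given particle positions.

    Returns the best position found and its objective value."""
    if rng is None:
        rng = np.random.default_rng(0)
    vel = np.zeros_like(positions)
    pbest = positions.copy()                      # per-particle bests
    pvals = np.array([f(p) for p in positions])
    g = pbest[np.argmin(pvals)].copy()            # swarm-global best
    for _ in range(iters):
        r1 = rng.random(positions.shape)
        r2 = rng.random(positions.shape)
        vel = w * vel + c1 * r1 * (pbest - positions) + c2 * r2 * (g - positions)
        positions = positions + vel
        vals = np.array([f(p) for p in positions])
        improved = vals < pvals
        pbest[improved] = positions[improved]
        pvals[improved] = vals[improved]
        g = pbest[np.argmin(pvals)].copy()
    return g, f(g)

def gso_like(f, dim=5, subswarms=4, per_swarm=10, epochs=3, seed=1):
    """GSO-style two-level search: level 1 runs PSO in each subswarm
    (the phase PGSO would parallelize across CPUs); level 2 runs PSO
    over the subswarm bests, treated as a 'superswarm'."""
    rng = np.random.default_rng(seed)
    swarms = [rng.uniform(-5, 5, (per_swarm, dim)) for _ in range(subswarms)]
    best, best_val = None, np.inf
    for _ in range(epochs):
        galaxy = [pso(s, f, rng=rng)[0] for s in swarms]   # level 1
        g, v = pso(np.array(galaxy), f, rng=rng)           # level 2
        if v < best_val:
            best, best_val = g, v
    return best, best_val

best, val = gso_like(sphere)
print(val)
```

Because the level-1 subswarm searches are mutually independent, they are the natural unit of parallel work; the sequential bottleneck is the level-2 superswarm phase, which must wait for all subswarm bests.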
Pages: 6691-6701 (11 pages)
Related papers (50 records)
  • [1] COOT optimization algorithm on training artificial neural networks
    Ozden, Aysenur
    Iseri, Ismail
    KNOWLEDGE AND INFORMATION SYSTEMS, 2023, 65 (08) : 3353 - 3383
  • [3] APPLICATION OF PARTICLE SWARM OPTIMIZATION ALGORITHM IN PROCESS OF ARTIFICIAL NEURAL NETWORKS TRAINING FOR SHORT TERM FORECASTING
    Baczynski, Dariusz
    RYNEK ENERGII, 2010, (04): : 52 - 56
  • [4] A hybrid of artificial fish swarm algorithm and particle swarm optimization for feedforward neural network training
    Chen, Huadong
    Wang, Shuzong
    Li, Jingxi
    Li, Yunfan
    PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON INTELLIGENT SYSTEMS AND KNOWLEDGE ENGINEERING (ISKE 2007), 2007,
  • [5] A Distributed Particle Swarm Optimization Algorithm Based on Apache Spark for Asynchronous Parallel Training of Deep Neural Networks
    Capel, Manuel I.
    Holgado-Terriza, Juan A.
    Galiana-Velasco, Sergio
    Salguero, Alberto G.
    53RD INTERNATIONAL CONFERENCE ON PARALLEL PROCESSING, ICPP 2024, 2024, : 76 - 85
  • [6] Galactic Swarm Optimization using Artificial Bee Colony Algorithm
    Kaya, Ersin
    Babaoglu, Ismail
    Kodaz, Halife
    2017 15TH INTERNATIONAL CONFERENCE ON ICT AND KNOWLEDGE ENGINEERING (ICT&KE), 2017, : 23 - 28
  • [7] Design of Artificial Neural Networks using a Modified Particle Swarm Optimization Algorithm
    Garro, Beatriz A.
    Sossa, Humberto
    Vazquez, Roberto A.
    IJCNN: 2009 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-6, 2009, : 2363 - 2370
  • [8] Fitness and Diversity Guided Particle Swarm Optimization for Global Optimization and Training Artificial Neural Networks
    Zhang, Xueyan
    Li, Lin
    Zhang, Yuzhu
    Yang, Guocai
    PROCEEDINGS OF THE 2016 INTERNATIONAL CONFERENCE ON PROGRESS IN INFORMATICS AND COMPUTING (PIC), VOL 1, 2016, : 74 - 81
  • [9] Training Optimization for Artificial Neural Networks
    Toribio Luna, Primitivo
    Alejo Eleuterio, Roberto
    Valdovinos Rosas, Rosa Maria
    Rodriguez Mendez, Benjamin Gonzalo
    CIENCIA ERGO-SUM, 2010, 17 (03) : 313 - 317
  • [10] A new constructive algorithm for designing and training artificial neural networks
    Sattar, Md. Abdus
    Islam, Md. Monirul
    Murase, Kazuyuki
    NEURAL INFORMATION PROCESSING, PART I, 2008, 4984 : 317 - +