A new parallel galactic swarm optimization algorithm for training artificial neural networks

Cited by: 4
Authors
Bhardwaj, Shubham [1 ]
Amali, Geraldine Bessie D. [2 ]
Phadke, Amrut [2 ]
Umadevi, K. S. [2 ]
Balakrishnan, P. [2 ]
Affiliations
[1] Reliance Jio, Hyderabad, India
[2] Vellore Inst Technol, Sch Comp Sci & Engn, Vellore, Tamil Nadu, India
Keywords
nature inspired metaheuristic; parallel computation; galactic swarm optimization; artificial neural networks;
DOI
10.3233/JIFS-179747
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Metaheuristic algorithms are a family of algorithms that provide near-optimal solutions to NP-hard problems in a reasonable amount of time. Galactic Swarm Optimization (GSO) is a state-of-the-art metaheuristic inspired by the motion of stars and galaxies under the influence of gravity. In this paper, a new scalable algorithm, Parallel Galactic Swarm Optimization (PGSO), is proposed to overcome the inherently sequential nature of GSO and to let the modified algorithm utilize the full computing capacity of the hardware efficiently. The modified algorithm also includes new features for training an Artificial Neural Network. The proposed algorithm is compared with Stochastic Gradient Descent in terms of performance and accuracy, and its performance was evaluated based on per-CPU utilization on multiple platforms. Experimental results show that PGSO outperforms GSO and competitors such as PSO in a variety of challenging settings.
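The paper itself is not reproduced in this record, so the following is only an assumption-laden sketch of the standard two-level GSO scheme the abstract alludes to: level 1 runs PSO independently inside each subswarm ("galaxy"), which is the stage a parallel variant such as PGSO would distribute across CPUs, and level 2 runs PSO over the galaxy bests (the "superswarm"). All function names, parameter values, and the benchmark objective are illustrative, not taken from the paper.

```python
import random

def sphere(x):
    # Illustrative benchmark objective: global minimum 0 at the origin.
    return sum(v * v for v in x)

def pso(swarm, f, iters, dim, w=0.72, c1=1.49, c2=1.49, bounds=(-5.0, 5.0)):
    """One PSO run over `swarm` (a list of position lists, mutated in place).
    Returns (best_position, best_value)."""
    vel = [[0.0] * dim for _ in swarm]
    pbest = [p[:] for p in swarm]
    pbest_val = [f(p) for p in swarm]
    g = min(range(len(swarm)), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i, p in enumerate(swarm):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - p[d])
                             + c2 * random.random() * (gbest[d] - p[d]))
                p[d] = min(max(p[d] + vel[i][d], bounds[0]), bounds[1])
            val = f(p)
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = p[:], val
                if val < gbest_val:
                    gbest, gbest_val = p[:], val
    return gbest, gbest_val

def gso(f, dim=5, galaxies=4, per_galaxy=10, epochs=3, l1_iters=30, l2_iters=30):
    """Two-level GSO sketch. In a parallel variant, the level-1 loop over
    subswarms is the part farmed out to separate workers."""
    random.seed(0)  # fixed seed so the sketch is reproducible
    subswarms = [[[random.uniform(-5.0, 5.0) for _ in range(dim)]
                  for _ in range(per_galaxy)]
                 for _ in range(galaxies)]
    best, best_val = None, float("inf")
    for _ in range(epochs):
        # Level 1: independent PSO inside each galaxy (embarrassingly parallel).
        galaxy_bests = [pso(s, f, l1_iters, dim)[0] for s in subswarms]
        # Level 2: superswarm PSO over copies of the galaxy bests.
        cand, cand_val = pso([g[:] for g in galaxy_bests], f, l2_iters, dim)
        if cand_val < best_val:
            best, best_val = cand, cand_val
    return best, best_val

if __name__ == "__main__":
    pos, val = gso(sphere)
    print(f"best value found: {val:.6f}")
```

Because each level-1 PSO run touches only its own subswarm, the sketch could be parallelized with, e.g., `multiprocessing.Pool.map` over `subswarms`; training a neural network would replace `sphere` with a loss evaluated at a flattened weight vector.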
Pages: 6691 - 6701 (11 pages)
Related Papers (50 total)
  • [41] Numerical optimization and feed-forward neural networks training using an improved optimization algorithm: multiple leader salp swarm algorithm
    Bairathi, Divya
    Gopalani, Dinesh
    EVOLUTIONARY INTELLIGENCE, 2021, 14 (03) : 1233 - 1249
  • [42] The Optimization of Fuzzy Neural Network Based on Artificial Fish Swarm Algorithm
    Lei Yanmin
    Feng Zhibin
    2013 IEEE NINTH INTERNATIONAL CONFERENCE ON MOBILE AD-HOC AND SENSOR NETWORKS (MSN 2013), 2013, : 469 - 473
  • [43] New global optimization algorithm for training feedforward neural networks and its application
    Li, Huan-Qin
    Wan, Bai-Wu
    Xitong Gongcheng Lilun yu Shijian/System Engineering Theory and Practice, 2003, 23 (08):
  • [44] A study on genetic algorithm optimization of artificial neural networks
    Zhong H.
    He G.
    Huo Y.
    Xie C.
    International Journal of Simulation: Systems, Science and Technology, 2016, 17 (25): : 37.1 - 37.6
  • [45] An incremental parallel tangent learning algorithm for artificial neural networks
    Nezami, AR
    Bhavsar, VC
    Ghorbani, AA
    1997 CANADIAN CONFERENCE ON ELECTRICAL AND COMPUTER ENGINEERING, CONFERENCE PROCEEDINGS, VOLS I AND II: ENGINEERING INNOVATION: VOYAGE OF DISCOVERY, 1997, : 301 - 304
  • [46] A hybrid artificial neural networks and particle swarm optimization for function approximation
    Su, Tejen
    Jhang, Jyunwei
    Hou, Chengchih
    INTERNATIONAL JOURNAL OF INNOVATIVE COMPUTING INFORMATION AND CONTROL, 2008, 4 (09): : 2363 - 2374
  • [47] An Improved Particle Swarm Optimization for Evolving Feedforward Artificial Neural Networks
    Jianbo Yu
    Lifeng Xi
    Shijin Wang
    Neural Processing Letters, 2007, 26 : 217 - 231
  • [48] An improved particle swarm optimization for evolving feedforward artificial neural networks
    Yu, Jianbo
    Xi, Lifeng
    Wang, Shijin
    NEURAL PROCESSING LETTERS, 2007, 26 (03) : 217 - 231
  • [49] A Survey on the Optimization of Artificial Neural Networks Using Swarm Intelligence Algorithms
    Emambocus, Bibi Aamirah Shafaa
    Jasser, Muhammed Basheer
    Amphawan, Angela
    IEEE ACCESS, 2023, 11 : 1280 - 1294
  • [50] Designing Artificial Neural Networks Using Particle Swarm Optimization Algorithms
    Garro, Beatriz A.
    Vazquez, Roberto A.
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2015, 2015