Guided Convergence for Training Feed-forward Neural Network using Novel Gravitational Search Optimization

Cited by: 0
Authors
Saha, Sankhadip [1 ]
Chakraborty, Dwaipayan [2 ]
Dutta, Oindrilla [1 ]
Affiliations
[1] Netaji Subhash Engn Coll, Dept Elect Engn, Kolkata, India
[2] Netaji Subhash Engn Coll, Dept Elect & Instru Engn, Kolkata, India
Source
2014 INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE COMPUTING AND APPLICATIONS (ICHPCA) | 2014
Keywords
Meta-heuristic; optimization; GSA; feed-forward neural network; local minima; ALGORITHM; BACKPROPAGATION;
DOI
Not available
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic and Communication Technology];
Subject Classification Codes
0808; 0809
Abstract
Training of feed-forward neural networks using stochastic optimization techniques has recently gained importance in various pattern recognition and data mining applications because of their capability of escaping the local minima trap. However, such techniques may suffer from slow and poor convergence. This fact motivates us to apply a meta-heuristic optimization technique to training the neural network. In this respect, we focus on training the neural network with the gravitational search algorithm (GSA), which is based on Newton's laws of motion and the gravitational interaction of masses. GSA has a good ability to search for the global optimum, but it may suffer from slow searching speed in the final iterations. Our work is directed towards smart convergence by modifying the original GSA and also guiding the algorithm to make it immune to the local minima trap. Results on various benchmark datasets demonstrate the robustness of the modified algorithm.
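For illustration only, the sketch below shows how a standard GSA (in the spirit of Rashedi et al.'s original formulation) can train a small feed-forward network: each agent's position is a flattened weight vector and its fitness is the training mean squared error. The network size, XOR dataset, hyper-parameter values, and all function names here are assumptions made for this sketch; the paper's guided-convergence modifications to GSA are not reproduced.

# Minimal sketch: standard GSA training a tiny 2-4-1 feed-forward network on XOR.
# Illustrative assumptions throughout; NOT the authors' modified algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

N_IN, N_HID, N_OUT = 2, 4, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # total number of weights/biases

def decode(w):
    """Split a flat agent position into the network's weight matrices and biases."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    return W1, b1, W2, b2

def mse(w):
    """Fitness = mean squared error of the decoded network on the dataset."""
    W1, b1, W2, b2 = decode(w)
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output layer
    return float(np.mean((out - y) ** 2))

# GSA hyper-parameters (typical values, assumed, not taken from the paper)
N_AGENTS, T_MAX = 30, 300
G0, ALPHA, EPS = 100.0, 20.0, 1e-12

pos = rng.uniform(-1.0, 1.0, size=(N_AGENTS, DIM))
vel = np.zeros((N_AGENTS, DIM))

for t in range(T_MAX):
    fit = np.array([mse(p) for p in pos])
    best, worst = fit.min(), fit.max()

    # Gravitational masses from normalized fitness (minimization problem)
    m = (fit - worst) / (best - worst + EPS)
    M = m / (m.sum() + EPS)

    G = G0 * np.exp(-ALPHA * t / T_MAX)                         # decaying gravitational constant
    kbest = int(round(N_AGENTS - (N_AGENTS - 1) * t / T_MAX))   # shrinking elite set
    elite = np.argsort(fit)[:kbest]

    # Acceleration on each agent from the Kbest fittest agents
    # (F_ij = G * M_i * M_j * (x_j - x_i) / (R_ij + eps); a_i = F_i / M_i, so M_i cancels)
    acc = np.zeros_like(pos)
    for i in range(N_AGENTS):
        for j in elite:
            if j == i:
                continue
            diff = pos[j] - pos[i]
            R = np.linalg.norm(diff)
            acc[i] += rng.random() * G * M[j] * diff / (R + EPS)

    vel = rng.random((N_AGENTS, 1)) * vel + acc   # stochastic velocity update
    pos = pos + vel

best_w = pos[np.argmin([mse(p) for p in pos])]
print("final MSE:", mse(best_w))

A slow-down in the final iterations is visible in the shrinking gravitational constant G and elite set Kbest above, which is the behaviour the paper's modified GSA targets.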
Pages: 6
Related Papers
50 records in total
  • [41] Re-configurable parallel Feed-Forward Neural Network implementation using FPGA
    El-Sharkawy, Mohamed
    Wael, Miran
    Mashaly, Maggie
    Azab, Eman
    INTEGRATION-THE VLSI JOURNAL, 2024, 97
  • [42] A Feed-Forward Neural Network for Increasing the Hopfield-Network Storage Capacity
    Zhao, Shaokai
    Chen, Bin
    Wang, Hui
    Luo, Zhiyuan
    Zhang, Tao
    INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2022, 32 (06)
  • [43] Feed-forward neural network optimized by hybridization of PSO and ABC for abnormal brain detection
    Wang, Shuihua
    Zhang, Yudong
    Dong, Zhengchao
    Du, Sidan
    Ji, Genlin
    Yan, Jie
    Yang, Jiquan
    Wang, Qiong
    Feng, Chunmei
    Phillips, Preetha
    INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, 2015, 25 (02) : 153 - 164
  • [44] Feed-Forward Neural Networks Training with Hybrid Taguchi Vortex Search Algorithm for Transmission Line Fault Classification
    Coban, Melih
    Tezcan, Suleyman Sungur
    MATHEMATICS, 2022, 10 (18)
  • [45] Discovery of Optimal Neurons and Hidden Layers in Feed-Forward Neural Network
    Thomas, Likewin
    Kumar, Manoj M., V
    Annappa, B.
    2016 IEEE INTERNATIONAL CONFERENCE ON EMERGING TECHNOLOGIES AND INNOVATIVE BUSINESS PRACTICES FOR THE TRANSFORMATION OF SOCIETIES (EMERGITECH), 2016, : 286 - 291
  • [46] Compiler Fuzzing Test Case Generation with Feed-forward Neural Network
    Xu H.-R.
    Wang Y.-J.
    Huang Z.-J.
    Xie P.-D.
    Fan S.-H.
    Ruan Jian Xue Bao/Journal of Software, 2022, 33 (06): : 1996 - 2011
  • [47] Research and application of quantum-inspired double parallel feed-forward neural network
    Ma, Yunpeng
    Niu, Peifeng
    Zhang, Xinxin
    Li, Guoqiang
    KNOWLEDGE-BASED SYSTEMS, 2017, 136 : 140 - 149
  • [48] Discriminating between distributions using feed-forward neural networks
    di Bella, Enrico
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2015, 85 (04) : 711 - 724
  • [49] Training of Feed-Forward Neural Networks by Using Optimization Algorithms Based on Swarm-Intelligent for Maximum Power Point Tracking
    Kaya, Ebubekir
    Kaya, Ceren Bastemur
    Bendes, Emre
    Atasever, Sema
    Ozturk, Basak
    Yazlik, Bilgin
    BIOMIMETICS, 2023, 8 (05)
  • [50] Software reliability testing coverage model using feed-forward back propagation neural network
    Bibyan, Ritu
    Anand, Sameer
    Jaiswal, Ajay
    Aggarwal, Anu Gupta
    INTERNATIONAL JOURNAL OF MODELLING IDENTIFICATION AND CONTROL, 2023, 43 (02) : 126 - 133