PSO-GA based hybrid with Adam Optimization for ANN training with application in Medical Diagnosis

Cited by: 54
Authors
Yadav, Rajesh K. [1 ]
Anubhav [1 ]
Affiliation
[1] Delhi Technol Univ, Dept Comp Sci & Engn, Delhi, India
Source
COGNITIVE SYSTEMS RESEARCH | 2020, Vol. 64
Keywords
ANNs (Artificial Neural Networks); PSO (Particle Swarm Optimization); GA (Genetic Algorithm); Gradient Descent; BP (Backward Propagation); Adam Optimization; Medical Diagnosis; EVOLUTIONARY ALGORITHMS;
DOI
10.1016/j.cogsys.2020.08.011
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper introduces a novel PSO-GA based hybrid training algorithm with Adam Optimization and compares its performance with the generic Gradient Descent based Backpropagation algorithm with Adam Optimization for training Artificial Neural Networks. We aim to overcome the shortcomings of the traditional algorithm, such as its slow convergence rate and frequent convergence to local minima, by employing the characteristics of evolutionary algorithms. PSO converges quickly, which compensates for the slow convergence of traditional BP caused by small gradient values, while the integration with GA mitigates the tendency to converge to local minima, since GA is capable of efficient global search. By integrating these algorithms, we propose a new hybrid algorithm for training ANNs. We compare both algorithms on the task of medical diagnosis. Results show that the proposed hybrid training algorithm significantly outperforms the traditional training algorithm, enhancing the accuracy of the ANNs with an increase of 20% in average testing accuracy and 0.7% in best testing accuracy. (C) 2020 Elsevier B.V. All rights reserved.
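To make the idea concrete, the following is a minimal, hypothetical sketch (not the authors' code) of a PSO-GA hybrid searching over the flattened weight vector of a tiny one-hidden-layer network on synthetic data. The network size, PSO/GA hyperparameters, fitness function, and toy dataset are all illustrative assumptions; the Adam-based gradient refinement described in the paper is only noted in a comment.

```python
# Minimal sketch: PSO step (fast convergence toward bests) followed by a GA step
# (crossover + mutation for global exploration), applied to ANN weight vectors.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2 inputs, binary target (hypothetical stand-in for a medical dataset).
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

N_IN, N_HID = 2, 4
DIM = N_IN * N_HID + N_HID + N_HID + 1   # W1, b1, W2, b2 flattened

def forward(w, X):
    """Unpack a flat weight vector and run the 2-4-1 network."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID]; i += N_HID
    b2 = w[i]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def fitness(w):
    """Lower is better: mean squared error of the network on the toy data."""
    return np.mean((forward(w, X) - y) ** 2)

# --- PSO state (illustrative coefficients) ---------------------------------
POP, ITERS = 30, 200
W_INERTIA, C1, C2 = 0.7, 1.5, 1.5
pos = rng.normal(scale=0.5, size=(POP, DIM))
vel = np.zeros((POP, DIM))
pbest = pos.copy()
pbest_fit = np.array([fitness(p) for p in pos])
gbest = pbest[np.argmin(pbest_fit)].copy()

for it in range(ITERS):
    # PSO step: move particles toward personal and global bests.
    r1, r2 = rng.random((POP, DIM)), rng.random((POP, DIM))
    vel = W_INERTIA * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
    pos = pos + vel

    # GA step: one-point crossover between random pairs plus Gaussian mutation
    # to preserve global search and help escape local minima.
    for k in range(0, POP - 1, 2):
        if rng.random() < 0.3:                       # crossover probability
            cut = rng.integers(1, DIM)
            pos[k, cut:], pos[k + 1, cut:] = pos[k + 1, cut:].copy(), pos[k, cut:].copy()
    mutate = rng.random((POP, DIM)) < 0.02           # mutation probability
    pos = pos + mutate * rng.normal(scale=0.1, size=(POP, DIM))

    # (The paper additionally couples the search with Adam-style gradient
    #  updates; that refinement is omitted from this sketch.)

    fit = np.array([fitness(p) for p in pos])
    improved = fit < pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[np.argmin(pbest_fit)].copy()

acc = np.mean((forward(gbest, X) > 0.5) == y)
print(f"best MSE {pbest_fit.min():.4f}, training accuracy {acc:.2%}")
```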
Pages: 191-199
Page count: 9