Towards improving the convolutional neural networks for deep learning using the distributed artificial bee colony method

Cited by: 32
Authors
Banharnsakun, Anan [1 ]
Affiliations
[1] Kasetsart Univ, Fac Engn Sriracha, Comp Engn Dept, Computat Intelligence Res Lab CIRLab, Sriracha Campus, Chon Buri 20230, Thailand
Keywords
Deep learning; Convolution neural networks; Distributed artificial bee colony; Pattern recognition; Classification; ALGORITHM; CLASSIFIERS;
DOI
10.1007/s13042-018-0811-z
CLC number (Chinese Library Classification)
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
During the past decade, the dramatic increase in chip processing power and the falling cost of computing hardware have led to the emergence of deep learning, a sub-field of machine learning that focuses on learning features from data and classifying them through multiple layers in hierarchical neural network architectures. Convolutional neural networks (CNNs) are among the most promising deep learning methods for a wide range of pattern recognition tasks. However, as with most artificial neural networks, CNN training is susceptible to becoming trapped in local optima, so improvements to the training procedure are required. Metaheuristic optimization methods are powerful tools for solving such optimization problems, yet research on using metaheuristics to optimize CNNs remains rare. In this work, the artificial bee colony (ABC) method, one of the most popular metaheuristics, is proposed as an alternative approach to optimizing the performance of a CNN. Specifically, the aim is to minimize classification error by initializing the weights of the CNN classifier with solutions generated by the ABC method. Moreover, a distributed ABC is also presented to keep execution time manageable when working with large training datasets. The experimental results demonstrate that the proposed method improves the performance of ordinary CNNs in both recognition accuracy and computing time.
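As an illustrative sketch only (not the authors' implementation), the ABC-based weight initialization described in the abstract might look roughly like the following. A simple linear classifier on synthetic data stands in for the full CNN, and all function names, hyperparameters, and data here are assumptions made for demonstration.

```python
# Minimal sketch of artificial bee colony (ABC) search used to propose initial
# classifier weights, scored by classification error on a data subset.
# Hypothetical example: the toy linear classifier replaces the CNN of the paper.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-class data standing in for a CNN's training set.
X = rng.normal(size=(200, 10))
true_w = rng.normal(size=10)
y = (X @ true_w > 0).astype(int)

def classification_error(w):
    """Error of a simple linear classifier; in the paper's setting this would
    be the error of the CNN whose weights are initialized from w."""
    pred = (X @ w > 0).astype(int)
    return np.mean(pred != y)

def abc_initialize(dim, n_food=20, limit=10, max_cycles=50, bound=1.0):
    """Basic ABC search over candidate weight vectors (food sources)."""
    foods = rng.uniform(-bound, bound, size=(n_food, dim))
    errors = np.array([classification_error(f) for f in foods])
    trials = np.zeros(n_food, dtype=int)

    def try_neighbor(i):
        # Perturb one dimension of solution i toward a random partner solution.
        k = rng.integers(n_food)
        while k == i:
            k = rng.integers(n_food)
        j = rng.integers(dim)
        candidate = foods[i].copy()
        candidate[j] += rng.uniform(-1, 1) * (foods[i, j] - foods[k, j])
        err = classification_error(candidate)
        if err < errors[i]:               # greedy selection
            foods[i], errors[i], trials[i] = candidate, err, 0
        else:
            trials[i] += 1

    for _ in range(max_cycles):
        # Employed bee phase: refine each food source once.
        for i in range(n_food):
            try_neighbor(i)
        # Onlooker bee phase: better sources are refined more often.
        fitness = 1.0 / (1.0 + errors)
        probs = fitness / fitness.sum()
        for i in rng.choice(n_food, size=n_food, p=probs):
            try_neighbor(i)
        # Scout bee phase: abandon sources that have stagnated too long.
        for i in np.where(trials > limit)[0]:
            foods[i] = rng.uniform(-bound, bound, size=dim)
            errors[i] = classification_error(foods[i])
            trials[i] = 0

    return foods[np.argmin(errors)], errors.min()

best_w, best_err = abc_initialize(dim=X.shape[1])
print(f"best initial classification error found by ABC: {best_err:.3f}")
# best_w would then seed the network's weights before gradient-based training.
```

In the distributed variant described in the abstract, one would expect the colony and/or the candidate evaluations to be partitioned across workers so that fitness computations on large training sets run in parallel; the best solution found then seeds the CNN's weights before standard training.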
Pages: 1301-1311
Number of pages: 11