Developing a Volunteer Computing Project to Evolve Convolutional Neural Networks and Their Hyperparameters

Cited by: 12
Authors
Desell, Travis [1]
Affiliation
[1] Univ North Dakota, Dept Comp Sci, Grand Forks, ND 58202 USA
Source
2017 IEEE 13TH INTERNATIONAL CONFERENCE ON E-SCIENCE (E-SCIENCE), 2017
DOI
10.1109/eScience.2017.14
Chinese Library Classification
TP39 [Computer Applications]
Subject Classification Codes
081203; 0835
Abstract
This work presents improvements to a neuroevolution algorithm called Evolutionary eXploration of Augmenting Convolutional Topologies (EXACT), which is capable of evolving the structure of convolutional neural networks (CNNs). While EXACT has multithreaded and parallel implementations, it has also been implemented as part of a volunteer computing project at the Citizen Science Grid to provide truly large-scale computing resources through over 5,500 volunteered computers. Improvements include the development of a new mutation operator, which increased the evolution rate by over an order of magnitude and was also shown to be significantly more reliable in generating new CNNs than the traditional method. Further, EXACT has been extended with a simplex hyperparameter optimization (SHO) method, which allows for the co-evolution of hyperparameters, simplifying their selection while generating smaller CNNs with predictive ability similar to those generated with fixed hyperparameters. Lastly, the backpropagation method has been updated with batch normalization and dropout. Compared to previous work, which only achieved prediction rates of 98.32% on the MNIST handwritten digits testing data after 60,000 evolved CNNs, these new advances allowed EXACT to achieve prediction rates of 99.43% within only 12,500 evolved CNNs, rates which are comparable to some of the best human-designed CNNs.
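The simplex hyperparameter optimization the abstract describes can be sketched as a Nelder-Mead-style reflection step: a candidate hyperparameter set is generated by reflecting the worst-performing set through the centroid of the better-performing ones. The sketch below is illustrative only, not the paper's implementation: the toy `fitness` function stands in for the test error of a trained CNN (EXACT's actual objective), and the hyperparameter names, ranges, and iteration count are assumptions.

```python
import random

# Illustrative stand-in for the real objective: in EXACT, fitness would be
# the test error of a CNN trained with these hyperparameters.
def fitness(hp):
    lr, momentum = hp
    return (lr - 0.01) ** 2 + (momentum - 0.9) ** 2

def simplex_proposal(population):
    """Propose new hyperparameters by reflecting the worst member
    through the centroid of the remaining members (Nelder-Mead style)."""
    ranked = sorted(population, key=fitness)
    better, worst = ranked[:-1], ranked[-1]
    dims = len(worst)
    centroid = [sum(p[i] for p in better) / len(better) for i in range(dims)]
    # Reflection: step from the worst point through the centroid.
    return [2 * centroid[i] - worst[i] for i in range(dims)]

random.seed(0)
# Assumed hyperparameters: learning rate and momentum, drawn from toy ranges.
population = [[random.uniform(0.001, 0.1), random.uniform(0.5, 0.99)]
              for _ in range(4)]
for _ in range(100):
    candidate = simplex_proposal(population)
    worst = max(population, key=fitness)
    if fitness(candidate) < fitness(worst):  # keep only improving proposals
        population[population.index(worst)] = candidate

best = min(population, key=fitness)
print("best hyperparameters:", [round(x, 3) for x in best])
```

Because a proposal is accepted only when it beats the current worst member, the population's best fitness never degrades, which mirrors how co-evolved hyperparameters can improve alongside the evolving network structures without a separate tuning run.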
Pages: 19-28
Page count: 10
Related Papers (50 total)
  • [1] On the Selection of Hyperparameters in Convolutional Neural Networks
    Wang, Donglin
    Wu, Qiang
    2021 INTERNATIONAL CONFERENCE ON COMPUTATIONAL SCIENCE AND COMPUTATIONAL INTELLIGENCE (CSCI 2021), 2021, : 1728 - 1731
  • [2] Large Scale Evolution of Convolutional Neural Networks Using Volunteer Computing
    Desell, Travis
    PROCEEDINGS OF THE 2017 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE COMPANION (GECCO'17 COMPANION), 2017, : 127 - 128
  • [3] Simplified swarm optimization for hyperparameters of convolutional neural networks
    Yeh, Wei-Chang
    Lin, Yi-Ping
    Liang, Yun-Chia
    Lai, Chyh-Ming
    Huang, Chia-Ling
    COMPUTERS & INDUSTRIAL ENGINEERING, 2023, 177
  • [4] A new hyperparameters optimization method for convolutional neural networks
    Cui, Hua
    Bai, Jie
    PATTERN RECOGNITION LETTERS, 2019, 125 : 828 - 834
  • [5] PBIL for Optimizing Hyperparameters of Convolutional Neural Networks and STL Decomposition
    Vasco-Carofilis, Roberto A.
    Gutierrez-Naranjo, Miguel A.
    Cardenas-Montes, Miguel
    HYBRID ARTIFICIAL INTELLIGENT SYSTEMS, HAIS 2020, 2020, 12344 : 147 - 159
  • [6] Optimizing Hyperparameters for Thai Cuisine Recognition via Convolutional Neural Networks
    Theera-Ampornpunt, Nawanol
    Treepong, Panisa
    TRAITEMENT DU SIGNAL, 2023, 40 (03) : 1187 - 1193
  • [7] Analyzing the effect of hyperparameters in a automobile classifier based on convolutional neural networks
    Riveros, Elian Laura
    Chavez, Jose Galdos
    Caceres, Juan C. Gutierrez
    PROCEEDINGS OF THE 2016 35TH INTERNATIONAL CONFERENCE OF THE CHILEAN COMPUTER SCIENCE SOCIETY (SCCC), 2016,
  • [8] Effects of hyperparameters on flow field reconstruction around a foil by convolutional neural networks
    Wu, Xia
    Wu, Shaobo
    Tian, Xinliang
    Guo, Xiaoxian
    Luo, Xiaofeng
    OCEAN ENGINEERING, 2022, 247
  • [9] Practical hyperparameters tuning of convolutional neural networks for EEG emotional features classification
    Mezzah, Samia
    Tari, Abdelkamel
    INTELLIGENT SYSTEMS WITH APPLICATIONS, 2023, 18
  • [10] Computing nasalance with MFCCs and Convolutional Neural Networks
    Lozano, Andres
    Nava, Enrique
    Garcia Mendez, Maria Dolores
    Moreno-Torres, Ignacio
    PLOS ONE, 2024, 19 (12)