Weighted Random Search for CNN Hyperparameter Optimization

Cited by: 25
Authors
Andonie, R. [1 ,2 ]
Florea, A. C. [2 ]
Affiliations
[1] Cent Washington Univ, Dept Comp Sci, Ellensburg, WA 98926 USA
[2] Transilvania Univ Brasov, Dept Elect & Comp, Brasov, Romania
Keywords
Hyperparameter optimization; supervised learning; random search; convolutional neural networks;
DOI
10.15837/ijccc.2020.2.3868
CLC number
TP [Automation and computer technology]
Subject classification code
0812
Abstract
Nearly all machine learning algorithms use two different sets of parameters: the training parameters and the meta-parameters (hyperparameters). While the training parameters are learned during the training phase, the values of the hyperparameters have to be specified before learning starts. For a given dataset, we would like to find the optimal combination of hyperparameter values in a reasonable amount of time. This is a challenging task because of its computational complexity. In previous work [11], we introduced the Weighted Random Search (WRS) method, a combination of Random Search (RS) and a probabilistic greedy heuristic. In the current paper, we compare the WRS method with several state-of-the-art hyperparameter optimization methods with respect to Convolutional Neural Network (CNN) hyperparameter optimization. The criterion is the classification accuracy achieved within the same number of tested combinations of hyperparameter values. According to our experiments, the WRS algorithm outperforms the other methods.
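The abstract describes WRS as a combination of Random Search with a probabilistic greedy heuristic. A minimal sketch of that idea is given below, assuming a scheme in which each hyperparameter either keeps its best-so-far value or is re-sampled with a per-hyperparameter probability; the function name `wrs_minimize`, the per-parameter `change_probs`, and the toy objective are illustrative assumptions, not the authors' implementation:

```python
import random

def wrs_minimize(objective, domains, change_probs, n_trials, seed=0):
    """Illustrative WRS-style search (a sketch, not the paper's code).

    objective    : maps a dict of hyperparameter values to a score
                   (higher is better, e.g. validation accuracy).
    domains      : dict name -> list of candidate values.
    change_probs : dict name -> probability of re-sampling that
                   hyperparameter instead of keeping its best value.
    """
    rng = random.Random(seed)
    # Start from a fully random combination, as plain Random Search would.
    best = {k: rng.choice(v) for k, v in domains.items()}
    best_score = objective(best)
    for _ in range(n_trials - 1):
        candidate = {}
        for k, values in domains.items():
            # Greedy step: keep the best value found so far, except with
            # probability change_probs[k], where a fresh value is drawn.
            if rng.random() < change_probs[k]:
                candidate[k] = rng.choice(values)
            else:
                candidate[k] = best[k]
        score = objective(candidate)
        if score > best_score:
            best, best_score = candidate, score
    return best, best_score

# Toy usage (hypothetical search space and objective):
domains = {"lr": [1e-1, 1e-2, 1e-3], "units": [32, 64, 128]}
probs = {"lr": 0.5, "units": 0.5}
toy = lambda h: -abs(h["lr"] - 1e-2) - abs(h["units"] - 64) / 100.0
best, best_score = wrs_minimize(toy, domains, probs, n_trials=500)
```

Like RS, each trial costs one objective evaluation, so the comparison criterion in the paper (accuracy after a fixed number of tested combinations) applies directly; the greedy bias simply concentrates later trials around the best combination found so far.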
Pages: 11
Related papers
42 records
  • [1] A Framework for Designing the Architectures of Deep Convolutional Neural Networks
    Albelwi, Saleh
    Mahmood, Ausif
    [J]. ENTROPY, 2017, 19 (06)
  • [2] [Anonymous], 2018, CORR
  • [3] [Anonymous], 2018, Model Evaluation, Model Selection, and Algorithm Selection in Machine Learning
  • [4] [Anonymous], J MEMBRANE COMPUTING
  • [5] [Anonymous], PROC CVPR IEEE
  • [6] [Anonymous], CORR
  • [7] [Anonymous], 2016, CORR
  • [8] [Anonymous], 2012, ADV NEURAL INF PROCE
  • [9] [Anonymous], ADV NEURAL INFORM PR
  • [10] [Anonymous], 2018, CORR