The Tabu_Genetic Algorithm: A Novel Method for Hyper-Parameter Optimization of Learning Algorithms

Cited by: 27
Authors
Guo, Baosu [1 ,2 ]
Hu, Jingwen [1 ]
Wu, Wenwen [1 ]
Peng, Qingjin [3 ]
Wu, Fenghe [1 ,2 ]
Affiliations
[1] Yanshan Univ, Sch Mech Engn, Qinhuangdao 066004, Hebei, Peoples R China
[2] Heavy Duty Intelligent Mfg Equipment Innovat Ctr, Qinhuangdao 066004, Hebei, Peoples R China
[3] Univ Manitoba, Dept Mech Engn, Winnipeg, MB R3T 5V6, Canada
Funding
National Natural Science Foundation of China;
Keywords
genetic algorithms; machine learning algorithms; neural networks; optimization methods; hyper-parameter optimization;
DOI
10.3390/electronics8050579
CLC number
TP [automation technology, computer technology];
Discipline code
0812;
Abstract
Machine learning algorithms have been widely applied to practical problems such as computer vision and speech processing. However, their performance depends heavily on their hyper-parameters: without good hyper-parameter values, these algorithms perform poorly. Unfortunately, for complex machine learning models such as deep neural networks, determining good hyper-parameters is very difficult. It is therefore important to develop efficient algorithms for automatic hyper-parameter optimization. This paper presents a novel hyper-parameter optimization method, the Tabu_Genetic Algorithm, which combines the advantages of a Genetic Algorithm and Tabu Search to search efficiently for the hyper-parameters of learning algorithms. To verify the performance of the proposed algorithm, two sets of comparative experiments are conducted, in which the Tabu_Genetic Algorithm and four other methods are used to search for good hyper-parameter values for deep convolutional neural networks. Experimental results show that, compared to Random Search and Bayesian optimization, the proposed Tabu_Genetic Algorithm finds a better model in less time. In both low-dimensional and high-dimensional spaces, the Tabu_Genetic Algorithm has better search capability, making it an effective method for finding the hyper-parameters of learning algorithms. The presented method provides a new solution to the hyper-parameter optimization problem of complex machine learning models, which will give machine learning algorithms better performance on practical problems.
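The abstract does not give implementation details, but the core idea of a hybrid GA/Tabu search can be sketched roughly as follows: a genetic algorithm evolves hyper-parameter configurations while a tabu list blocks re-evaluation of configurations that have already been tried. This is a minimal illustrative sketch, not the paper's actual design; the toy objective function, the hyper-parameter grid (`LR_CHOICES`, `UNIT_CHOICES`), and the specific crossover and mutation operators are all assumptions made here for illustration.

```python
import random

# Toy "validation loss" standing in for training a model with the given
# hyper-parameters; in the paper's setting this would be the error of a
# deep convolutional network trained with that configuration.
def objective(cfg):
    lr, units = cfg
    return (lr - 0.01) ** 2 + (units - 64) ** 2 / 1e4

# Illustrative discrete hyper-parameter grid (an assumption, not from the paper).
LR_CHOICES = [0.001, 0.005, 0.01, 0.05, 0.1]
UNIT_CHOICES = [16, 32, 64, 128, 256]

def random_cfg(rng):
    return (rng.choice(LR_CHOICES), rng.choice(UNIT_CHOICES))

def mutate(cfg, rng):
    # Resample one hyper-parameter at random.
    lr, units = cfg
    if rng.random() < 0.5:
        lr = rng.choice(LR_CHOICES)
    else:
        units = rng.choice(UNIT_CHOICES)
    return (lr, units)

def crossover(a, b, rng):
    # Exchange coordinates between two parent configurations.
    return (a[0], b[1]) if rng.random() < 0.5 else (b[0], a[1])

def tabu_genetic_search(pop_size=8, generations=20, seed=0):
    rng = random.Random(seed)
    tabu = set()  # every configuration ever evaluated
    pop = []
    while len(pop) < pop_size:
        c = random_cfg(rng)
        if c not in tabu:
            tabu.add(c)
            pop.append(c)
    best = min(pop, key=objective)
    for _ in range(generations):
        pop.sort(key=objective)
        parents = pop[: pop_size // 2]  # elitist selection
        children, tries = [], 0
        while len(children) < pop_size and tries < 200:
            tries += 1
            child = mutate(
                crossover(rng.choice(parents), rng.choice(parents), rng), rng)
            if child in tabu:  # tabu list forbids re-evaluating duplicates
                continue
            tabu.add(child)
            children.append(child)
        if not children:  # search space exhausted
            break
        pop = parents + children
        cand = min(pop, key=objective)
        if objective(cand) < objective(best):
            best = cand
    return best, tabu

best, seen = tabu_genetic_search()
```

The tabu list here plays the role the abstract attributes to Tabu Search: it prevents the genetic operators from wasting expensive training runs on configurations that have already been evaluated, which is one plausible way the hybrid could reduce search time relative to plain Random Search.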
Pages: 19