RHOASo: An Early Stop Hyper-Parameter Optimization Algorithm

Times Cited: 2
Authors
Munoz Castaneda, Angel Luis [1 ,2 ]
DeCastro-Garcia, Noemi [1 ,2 ]
Escudero Garcia, David [2 ]
Affiliations
[1] Univ Leon, Dept Math, Leon 24007, Spain
[2] Univ Leon, Res Inst Appl Sci Cybersecur RIASC, Leon 24007, Spain
Keywords
hyperparameters; machine learning; optimization; inference; particle swarm; random search
DOI
10.3390/math9182334
Chinese Library Classification (CLC)
O1 [Mathematics]
Subject Classification Codes
0701; 070101
Abstract
This work proposes RHOASo, a new algorithm for optimizing the hyper-parameters of a machine learning algorithm, based on conditional optimization of concave asymptotic functions. A comparative analysis of the algorithm is presented, with particular emphasis on two important properties: its ability to work efficiently with only a small part of a dataset and to finish the tuning process automatically, that is, without the user having to specify the number of iterations the algorithm must perform. Statistical analyses over 16 public benchmark datasets were carried out, comparing the performance of seven hyper-parameter optimization algorithms with RHOASo. The efficiency of RHOASo shows positive, statistically significant differences with respect to the other hyper-parameter optimization algorithms considered in the experiments. Furthermore, it is shown that, on average, the algorithm needs around 70% of the iterations required by the other algorithms to achieve competitive performance. The results also show that the algorithm exhibits significant stability with respect to the size of the dataset partition used.
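The two properties emphasized above, tuning on a small dataset partition and stopping automatically without a user-supplied iteration budget, can be illustrated with a generic sketch. The following Python snippet is not the RHOASo algorithm (its update rule, based on conditional optimization of concave asymptotic functions, is not reproduced here); it is a minimal random-search loop with an assumed plateau-based stopping rule, and all names and thresholds (sample_params, PATIENCE, MIN_DELTA, the 20% partition) are illustrative assumptions.

```python
# Illustrative sketch only: NOT the authors' RHOASo algorithm.
# It mimics two properties described in the abstract:
#   (1) tuning on a small partition of the dataset, and
#   (2) stopping automatically, without a fixed iteration budget.
# The plateau-based stopping rule and all constants are assumptions.
import random

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)

# (1) Tune on a small partition of the data (here ~20%); the rest is untouched.
X_small, _, y_small, _ = train_test_split(
    X, y, train_size=0.2, random_state=0, stratify=y
)
X_tr, X_val, y_tr, y_val = train_test_split(
    X_small, y_small, test_size=0.3, random_state=0, stratify=y_small
)

def sample_params(rng):
    """Draw one hyper-parameter configuration at random (assumed search space)."""
    return {
        "n_estimators": rng.choice([25, 50, 100, 200]),
        "max_depth": rng.choice([2, 4, 8, None]),
        "min_samples_leaf": rng.choice([1, 2, 5]),
    }

PATIENCE = 10      # stop after this many rounds without improvement
MIN_DELTA = 1e-3   # minimum improvement that counts as progress

rng = random.Random(0)
best_score, best_params, stale, iterations = -1.0, None, 0, 0

# (2) No fixed iteration count: the loop ends when the validation score plateaus.
while stale < PATIENCE:
    params = sample_params(rng)
    model = RandomForestClassifier(random_state=0, **params).fit(X_tr, y_tr)
    score = accuracy_score(y_val, model.predict(X_val))
    iterations += 1
    if score > best_score + MIN_DELTA:
        best_score, best_params, stale = score, params, 0
    else:
        stale += 1

print(f"stopped after {iterations} configurations")
print(f"best validation accuracy {best_score:.3f} with {best_params}")
```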
Pages: 52
Related Papers
50 records in total
  • [41] Hyper-Parameter Optimization by Using the Genetic Algorithm for Upper Limb Activities Recognition Based on Neural Networks
    Zhang, Junjie
    Sun, Guangmin
    Sun, Yuge
    Dou, Huijing
    Bilal, Anas
    IEEE SENSORS JOURNAL, 2021, 21 (02) : 1877 - 1884
  • [43] Hyper-parameter optimization tools comparison for multiple object tracking applications
    Madrigal, Francisco
    Maurice, Camille
    Lerasle, Frédéric
    MACHINE VISION AND APPLICATIONS, 2019, 30 : 269 - 289
  • [44] An Experimental Study on Hyper-parameter Optimization for Stacked Auto-Encoders
    Sun, Yanan
    Xue, Bing
    Zhang, Mengjie
    Yen, Gary G.
    2018 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2018, : 638 - 645
  • [45] Application of a Hyper-Parameter Optimization Algorithm Using MARS Surrogate for Deep PolSAR Image Classification Models
    Liu, Guangyuan
    Li, Yangyang
    Jiao, Licheng
    IGARSS 2020 - 2020 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, 2020, : 2591 - 2594
  • [46] A new hyper-parameter optimization method for machine learning in fault classification
    Ye, Xingchen
    Gao, Liang
    Li, Xinyu
    Wen, Long
    APPLIED INTELLIGENCE, 2023, 53 (11) : 14182 - 14200
  • [47] Image classification based on KPCA and SVM with randomized hyper-parameter optimization
    Li, Lin
    Lian, Jin
    Wu, Yue
    Ye, Mao
    INTERNATIONAL JOURNAL OF SIGNAL PROCESSING, IMAGE PROCESSING AND PATTERN RECOGNITION, 2014, 7 (04) : 303 - 316
  • [48] Hyper-parameter optimization for improving the performance of localization in an iterative ensemble smoother
    Luo, Xiaodong
    Cruz, William C.
    Zhang, Xin-Lei
    Xiao, Heng
    GEOENERGY SCIENCE AND ENGINEERING, 2023, 231
  • [49] Facilitating Database Tuning with Hyper-Parameter Optimization: A Comprehensive Experimental Evaluation
    Zhang, Xinyi
    Chang, Zhuo
    Li, Yang
    Wu, Hong
    Tan, Jian
    Li, Feifei
    Cui, Bin
    PROCEEDINGS OF THE VLDB ENDOWMENT, 2022, 15 (09): : 1808 - 1821
  • [50] Particle Swarm Optimization for Hyper-Parameter Selection in Deep Neural Networks
    Lorenzo, Pablo Ribalta
    Nalepa, Jakub
    Kawulok, Michal
    Sanchez Ramos, Luciano
    Ranilla Pastor, Jose
    PROCEEDINGS OF THE 2017 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE (GECCO'17), 2017, : 481 - 488