RHOASo: An Early Stop Hyper-Parameter Optimization Algorithm

Cited by: 2
Authors
Muñoz Castañeda, Ángel Luis [1,2]
DeCastro-García, Noemí [1,2]
Escudero García, David [2]
Affiliations
[1] Univ Leon, Dept Math, Leon 24007, Spain
[2] Univ Leon, Res Inst Appl Sci Cybersecur RIASC, Leon 24007, Spain
Keywords
hyperparameters; machine learning; optimization; inference; particle swarm; random search
DOI
10.3390/math9182334
Chinese Library Classification
O1 [Mathematics]
Subject Classification Codes
0701; 070101
Abstract
This work proposes RHOASo, a new algorithm for optimizing the hyper-parameters of a machine learning algorithm, based on the conditional optimization of concave asymptotic functions. A comparative analysis of the algorithm is presented, with particular emphasis on two important properties: its ability to work efficiently with only a small part of a dataset, and its ability to finish the tuning process automatically, that is, without the user having to specify in advance the number of iterations the algorithm must perform. Statistical analyses comparing the performance of RHOASo with that of seven other hyper-parameter optimization algorithms were carried out over 16 public benchmark datasets. The efficiency of RHOASo shows positive, statistically significant differences with respect to the other hyper-parameter optimization algorithms considered in the experiments. Furthermore, it is shown that, on average, the algorithm needs around 70% of the iterations required by the other algorithms to achieve competitive performance. The results also show that the algorithm is notably stable with respect to the size of the dataset partition used.
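This record does not reproduce RHOASo's pseudocode, but the two properties highlighted in the abstract lend themselves to a brief illustration. The Python sketch below is not RHOASo itself (which rests on conditional optimization of concave asymptotic functions); it only mimics the two advertised behaviours with a generic random search: tuning on a small subsample of the data, and stopping automatically once improvement stalls rather than after a user-fixed number of iterations. The search space, the PATIENCE threshold, and the stopping rule are illustrative assumptions, not taken from the paper.

# Minimal sketch (NOT the RHOASo algorithm itself): a generic early-stopping
# hyper-parameter search that (a) tunes on a small subsample of the dataset
# and (b) stops automatically when improvement stalls, instead of running a
# user-fixed number of iterations. Search space, PATIENCE, and the stopping
# rule are illustrative assumptions.
import random

from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, train_test_split

random.seed(0)

X, y = load_digits(return_X_y=True)
# Property (a): tune on a small fraction (20%) of the available data.
X_small, _, y_small, _ = train_test_split(
    X, y, train_size=0.2, stratify=y, random_state=0
)

search_space = {
    "n_estimators": range(10, 201, 10),
    "max_depth": range(2, 21),
}

def sample_config():
    """Draw one random hyper-parameter configuration from the grid."""
    return {name: random.choice(values) for name, values in search_space.items()}

PATIENCE = 5           # hypothetical stall threshold
best_score, best_config, stall = -1.0, None, 0

# Property (b): no fixed iteration budget; the loop ends on its own when
# PATIENCE consecutive configurations fail to improve the best CV score.
while stall < PATIENCE:
    config = sample_config()
    model = RandomForestClassifier(**config, random_state=0)
    score = cross_val_score(model, X_small, y_small, cv=3).mean()
    if score > best_score:
        best_score, best_config, stall = score, config, 0
    else:
        stall += 1

print(f"best config: {best_config}  (CV accuracy on subsample: {best_score:.3f})")

The stall-based rule is the simplest automatic stopping criterion; the paper's actual criterion derives from the asymptotic behaviour of the concave functions it optimizes.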
Pages: 52
Related Papers
50 items in total
  • [21] Hyper-parameter Tuning of a Decision Tree Induction Algorithm
    Mantovani, Rafael G.
    Horvath, Tomas
    Cerri, Ricardo
    Vanschoren, Joaquin
    de Carvalho, Andre C. P. L. F.
PROCEEDINGS OF 2016 5TH BRAZILIAN CONFERENCE ON INTELLIGENT SYSTEMS (BRACIS 2016), 2016: 37-42
  • [22] Research on Hyper-Parameter Optimization of Activity Recognition Algorithm Based on Improved Cuckoo Search
    Tong, Yu
    Yu, Bo
    ENTROPY, 2022, 24 (06)
  • [23] The Tabu_Genetic Algorithm: A Novel Method for Hyper-Parameter Optimization of Learning Algorithms
    Guo, Baosu
    Hu, Jingwen
    Wu, Wenwen
    Peng, Qingjin
    Wu, Fenghe
    ELECTRONICS, 2019, 8 (05)
  • [24] Hyper-Parameter Optimization for Improving the Performance of Grammatical Evolution
    Wang, Hao
    Lou, Yitan
Bäck, Thomas
2019 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2019: 2649-2656
  • [25] Genetic algorithm hyper-parameter optimization using Taguchi design for groundwater pollution source identification
    Xia, Xuemin
    Jiang, Simin
    Zhou, Nianqing
    Li, Xianwen
    Wang, Lichun
WATER SUPPLY, 2019, 19 (01): 137-146
  • [26] USING METAHEURISTICS FOR HYPER-PARAMETER OPTIMIZATION OF CONVOLUTIONAL NEURAL NETWORKS
    Bibaeva, Victoria
2018 IEEE 28TH INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2018
  • [27] Hyper-Parameter Optimization for Privacy-Preserving Record Linkage
    Yu, Joyce
    Nabaglo, Jakub
    Vatsalan, Dinusha
    Henecka, Wilko
    Thorne, Brian
ECML PKDD 2020 WORKSHOPS, 2020, 1323: 281-296
  • [28] HYPER-PARAMETER OPTIMIZATION OF DEEP CONVOLUTIONAL NETWORKS FOR OBJECT RECOGNITION
    Talathi, Sachin S.
2015 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2015: 3982-3986
  • [29] Rethinking density ratio estimation based hyper-parameter optimization
    Fan, Zi-En
    Lian, Feng
    Li, Xin-Ran
    NEURAL NETWORKS, 2025, 182
  • [30] Experimental evaluation of stochastic configuration networks: Is SC algorithm inferior to hyper-parameter optimization method?
    Hu, Minghui
    Suganthan, P.N.
APPLIED SOFT COMPUTING, 2022, 126