An Approach to Hyperparameter Optimization for the Objective Function in Machine Learning

Cited by: 14
Authors
Kim, Yonghoon [1 ]
Chung, Mokdong [1 ]
Affiliations
[1] Pukyong Natl Univ, Dept Comp Engn, Pusan 48513, South Korea
Funding
National Research Foundation, Singapore;
Keywords
bayesian optimization; gaussian process; learning rate; acquisition function; machine learning; GLOBAL OPTIMIZATION;
DOI
10.3390/electronics8111267
Chinese Library Classification (CLC)
TP [automation technology; computer technology];
Subject classification code
0812;
Abstract
In machine learning, performance is of great value, yet each learning process requires considerable time and effort to set its parameters. A critical problem in machine learning is determining the hyperparameters, such as the learning rate, mini-batch size, and regularization coefficient. In particular, we focus on the learning rate, which is directly related to learning efficiency and performance. Bayesian optimization using a Gaussian process is commonly used for this purpose. In this paper, building on Bayesian optimization, we optimize the hyperparameters automatically by using a Gamma distribution instead of a Gaussian distribution, to improve training performance on an image discrimination task. As a result, the proposed method estimates the learning rate more reasonably and efficiently during training and can be useful in machine learning.
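The workflow the abstract describes, a Gaussian-process surrogate whose next learning-rate candidate is selected by an acquisition function, with candidates drawn from a Gamma rather than a Gaussian distribution, can be illustrated with the minimal sketch below. The toy objective, the Expected Improvement acquisition, and the Gamma shape/scale values are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of Bayesian optimization of a learning rate with a GP
# surrogate and Gamma-distributed candidates (illustrative assumptions only).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def objective(lr):
    # Hypothetical stand-in for validation loss as a function of the learning
    # rate; in practice this would train the model and return its error.
    return (np.log10(lr) + 2.0) ** 2 + 0.05 * rng.standard_normal()

def expected_improvement(log_lr, gp, best_y, xi=0.01):
    # Expected Improvement acquisition function (minimization form).
    mu, sigma = gp.predict(log_lr.reshape(-1, 1), return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    imp = best_y - mu - xi
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

# A few initial learning rates evaluated before the optimization loop.
lrs = np.array([1e-4, 1e-3, 1e-1])
losses = np.array([objective(lr) for lr in lrs])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(15):
    # Fit the GP surrogate on log10(learning rate) vs. observed loss.
    gp.fit(np.log10(lrs).reshape(-1, 1), losses)
    # Candidate learning rates drawn from a Gamma distribution (assumed
    # shape/scale), echoing the use of a Gamma rather than a Gaussian.
    cand = np.clip(rng.gamma(shape=2.0, scale=5e-3, size=256), 1e-6, 1.0)
    ei = expected_improvement(np.log10(cand), gp, losses.min())
    next_lr = cand[np.argmax(ei)]
    lrs = np.append(lrs, next_lr)
    losses = np.append(losses, objective(next_lr))

print(f"best learning rate found: {lrs[np.argmin(losses)]:.2e}")
```

One motivation for a Gamma proposal, under these assumptions, is that its support is the positive real line, which matches the admissible range of a learning rate, whereas a Gaussian proposal can generate negative candidates that must be discarded.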
Pages: 19
Related papers
50 items in total
  • [21] Hyperparameter optimization in learning systems
    Andonie, Razvan
    JOURNAL OF MEMBRANE COMPUTING, 2019, 1 (04) : 279 - 291
  • [22] Learning Hyperparameter Optimization Initializations
    Wistuba, Martin
    Schilling, Nicolas
    Schmidt-Thieme, Lars
    PROCEEDINGS OF THE 2015 IEEE INTERNATIONAL CONFERENCE ON DATA SCIENCE AND ADVANCED ANALYTICS (IEEE DSAA 2015), 2015, : 339 - 348
  • [24] Understanding the Effect of Hyperparameter Optimization on Machine Learning Models for Structure Design Problems
    Du, Xianping
    Xu, Hongyi
    Zhu, Feng
    COMPUTER-AIDED DESIGN, 2021, 135
  • [25] Hyperparameter optimization: Classics, acceleration, online, multi-objective, and tools
    Tan, J. M.
    Liao, H.
    Liu, W.
    Fan, C.
    Huang, J.
    Liu, Z.
    Yan, J.
    MATHEMATICAL BIOSCIENCES AND ENGINEERING, 2024, 21 (06) : 6289 - 6335
  • [26] A Collection of Quality Diversity Optimization Problems Derived from Hyperparameter Optimization of Machine Learning Models
    Schneider, Lennart
    Pfisterer, Florian
    Thomas, Janek
    Bischl, Bernd
    PROCEEDINGS OF THE 2022 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE COMPANION, GECCO 2022, 2022, : 2136 - 2142
  • [27] Credit Default Risk Analysis Using Machine Learning Algorithms with Hyperparameter Optimization
    Inga, Juan
    Sacoto-Cabrera, Erwin
    INTELLIGENT TECHNOLOGIES: DESIGN AND APPLICATIONS FOR SOCIETY, CITIS 2022, 2023, 607 : 81 - 95
  • [28] Machine Learning-based Test Case Prioritization using Hyperparameter Optimization
    Khan, Md Asif
    Azim, Akramul
    Liscano, Ramiro
    Smith, Kevin
    Tauseef, Qasim
    Seferi, Gkerta
    Chang, Yee-Kang
    PROCEEDINGS OF THE 2024 IEEE/ACM INTERNATIONAL CONFERENCE ON AUTOMATION OF SOFTWARE TEST, AST 2024, 2024, : 125 - 135
  • [29] Bayesian Optimization is Superior to Random Search for Machine Learning Hyperparameter Tuning: Analysis of the Black-Box Optimization Challenge 2020
    Turner, Ryan
    Eriksson, David
    McCourt, Michael
    Kiili, Juha
    Laaksonen, Eero
    Xu, Zhen
    Guyon, Isabelle
    NEURIPS 2020 COMPETITION AND DEMONSTRATION TRACK, VOL 133, 2020, 133 : 3 - 26
  • [30] Automated Hyperparameter Tuning and Ensemble Machine Learning Approach for Network Traffic Classification
    Chen, Liwei
    Sun, Xiu
    Li, Yuchan
    Jaseemuddin, Muhammad
    Kazi, Baha Uddin
    19TH IEEE INTERNATIONAL SYMPOSIUM ON BROADBAND MULTIMEDIA SYSTEMS AND BROADCASTING, BMSB 2024, 2024, : 690 - 695