Application of a Stochastic Schemata Exploiter for Multi-Objective Hyper-parameter Optimization of Machine Learning

Cited: 0
Authors
Makino, Hiroya [1 ]
Kita, Eisuke [1 ]
Affiliations
[1] Nagoya Univ, Grad Sch Informat, Nagoya, Japan
Source
The Review of Socionetwork Strategies, 2023, Vol. 17, No. 2
Keywords
AutoML; Stochastic schemata exploiter; Evolutionary algorithm; Hyper-parameter optimization; Regression shrinkage; Selection; Algorithm; Search
DOI
10.1007/s12626-023-00151-1
Chinese Library Classification (CLC)
TP [automation technology, computer technology]
Discipline code
0812
Abstract
The Stochastic Schemata Exploiter (SSE) is an evolutionary algorithm designed to find the optimal solution of a function. SSE extracts common schemata from sets of high-fitness individuals and generates new individuals from those schemata. For hyper-parameter optimization, the initialization method, the schema extraction method, and the new-individual generation method, which are SSE's characteristic processes, are extended. This paper proposes an SSE-based multi-objective optimization method for AutoML. AutoML gives good results in terms of model accuracy; however, if only accuracy is considered, the resulting model may become too complex, and such complex models are not always acceptable because of their long computation times. The proposed method simultaneously maximizes the accuracy of the stacking model and minimizes the model complexity. Compared with existing methods, SSE has attractive features such as fewer control parameters and faster convergence. A visualization method makes the optimization process transparent and helps users understand it.
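The core SSE mechanism the abstract describes, extracting a common schema from high-fitness individuals and generating new individuals from it, can be sketched for a simple bit-string encoding. This is a minimal illustrative sketch, not the paper's implementation; the function names `extract_schema` and `generate_from_schema` are hypothetical, and `"*"` is used as the wildcard symbol by assumption.

```python
import random

def extract_schema(individuals):
    """Extract the common schema from a set of high-fitness individuals.

    Positions where all individuals agree become fixed genes; positions
    where they differ become wildcards ('*')."""
    return tuple(
        genes[0] if len(set(genes)) == 1 else "*"
        for genes in zip(*individuals)
    )

def generate_from_schema(schema, alphabet=(0, 1)):
    """Generate a new individual from a schema: keep the fixed genes
    and fill each wildcard position with a random allele."""
    return tuple(
        random.choice(alphabet) if gene == "*" else gene
        for gene in schema
    )

# Example: three high-fitness bit strings share genes at positions 0 and 2.
top = [(1, 0, 1, 1), (1, 1, 1, 0), (1, 0, 1, 0)]
schema = extract_schema(top)        # (1, '*', 1, '*')
child = generate_from_schema(schema)  # e.g. (1, 0, 1, 1)
```

In hyper-parameter optimization, the genes would instead encode hyper-parameter choices (learning rate bins, model types, and so on), which is why the paper extends the initialization, extraction, and generation steps beyond this plain bit-string form.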
Pages: 179-213 (35 pages)
Related Papers (50 total)
  • [1] Learning networks hyper-parameter using multi-objective optimization of statistical performance metrics
    Torres, Guillermo
    Sanchez, Carles
    Gil, Debora
    2022 24th International Symposium on Symbolic and Numeric Algorithms for Scientific Computing (SYNASC), 2022: 233-238
  • [2] HyperASPO: Fusion of Model and Hyper Parameter Optimization for Multi-objective Machine Learning
    Kannan, Aswin
    Choudhury, Anamitra Roy
    Saxena, Vaibhav
    Raje, Saurabh
    Ram, Parikshit
    Verma, Ashish
    Sabharwal, Yogish
    2021 IEEE International Conference on Big Data (Big Data), 2021: 790-800
  • [3] Neural Networks Designing Neural Networks: Multi-Objective Hyper-Parameter Optimization
    Smithson, Sean C.
    Yang, Guang
    Gross, Warren J.
    Meyer, Brett H.
    2016 IEEE/ACM International Conference on Computer-Aided Design (ICCAD), 2016
  • [4] Multi-objective simulated annealing for hyper-parameter optimization in convolutional neural networks
    Gulcu, Ayla
    Kus, Zeki
    PeerJ Computer Science, 2021, 7: 2-27
  • [5] Annealing of Monel 400 Alloy Using Principal Component Analysis, Hyper-parameter Optimization, Machine Learning Techniques, and Multi-objective Particle Swarm Optimization
    Chintakindi, Sanjay
    Alsamhan, Ali
    Abidi, Mustufa Haider
    Kumar, Maduri Praveen
    International Journal of Computational Intelligence Systems, 2022, 15 (01)
  • [6] Federated learning with hyper-parameter optimization
    Kundroo, Majid
    Kim, Taehong
    Journal of King Saud University - Computer and Information Sciences, 2023, 35 (09)
  • [7] A new hyper-parameter optimization method for machine learning in fault classification
    Ye, Xingchen
    Gao, Liang
    Li, Xinyu
    Wen, Long
    Applied Intelligence, 2023, 53 (11): 14182-14200