Application of a Stochastic Schemata Exploiter for Multi-Objective Hyper-parameter Optimization of Machine Learning

Cited by: 1
Authors
Makino, Hiroya [1]
Kita, Eisuke [1]
Affiliations
[1] Nagoya Univ, Grad Sch Informat, Nagoya, Japan
Keywords
AutoML; Stochastic schemata exploiter; Evolutionary algorithm; Hyper-parameter optimization; REGRESSION SHRINKAGE; SELECTION; ALGORITHM; SEARCH;
DOI
10.1007/s12626-023-00151-1
Chinese Library Classification (CLC)
TP [Automation technology, computer technology]
Discipline Code
0812
Abstract
The Stochastic Schemata Exploiter (SSE), an evolutionary algorithm, is designed to find the optimal solution of a function. SSE extracts common schemata from sets of individuals with high fitness and generates new individuals from those schemata. For hyper-parameter optimization, the initialization method, the schema-extraction method, and the new-individual-generation method, which are the characteristic processes of SSE, are extended. In this paper, an SSE-based multi-objective optimization method for AutoML is proposed. AutoML gives good results in terms of model accuracy; however, if only accuracy is considered, the resulting model may be too complex, and such complex models are not always acceptable because of their long computation times. The proposed method maximizes the stacking-model accuracy and minimizes the model complexity simultaneously. Compared with existing methods, SSE has attractive features such as fewer control parameters and faster convergence. The visualization method makes the optimization process transparent and helps users understand it.
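The core SSE step described above — extracting a common schema from high-fitness individuals and sampling new individuals from it — can be sketched as follows. This is an illustrative toy on binary strings, not the authors' implementation; the population, fitness values, and wildcard convention (`*` for positions where individuals disagree) are assumptions for the example.

```python
import random

def extract_schema(individuals):
    """Keep a bit where all individuals agree; mark disagreements with '*'."""
    schema = []
    for genes in zip(*individuals):
        schema.append(genes[0] if len(set(genes)) == 1 else "*")
    return schema

def sample_from_schema(schema):
    """Generate a new individual by filling wildcard positions randomly."""
    return [random.choice("01") if g == "*" else g for g in schema]

# Toy example: three high-fitness individuals (fitness ranking assumed).
top_individuals = [list("11010"), list("11000"), list("11011")]
schema = extract_schema(top_individuals)   # ['1', '1', '0', '*', '*']
child = sample_from_schema(schema)
```

In the full algorithm, this extraction/generation cycle would be repeated over nested subsets of the ranked population, so that more selective subsets yield more specific schemata.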
Pages: 179-213 (35 pages)