Hyperparameter recommendation via automated meta-feature selection embedded with kernel group Lasso learning

Times Cited: 0
Authors
Deng, Liping [1 ]
Xiao, Mingqing [2 ]
Affiliations
[1] Univ Calif Riverside, Dept Math, Riverside, CA 92521 USA
[2] Southern Illinois Univ, Sch Math & Stat Sci, Carbondale, IL 62901 USA
Keywords
Hyperparameter recommendation; Meta-feature selection; Meta-learning; Multivariate kernel group Lasso; ALGORITHMS; SEARCH; MODELS;
DOI
10.1016/j.knosys.2024.112706
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Hyperparameter recommendation via meta-learning relies on the characterization and quality of meta-features. These meta-features carry critical information about the underlying datasets but are often selected manually, based on the practitioner's experience and preference, which can be inefficient and ineffective in many applications. In this paper, we propose a novel hyperparameter recommendation approach built on a multivariate kernel group Lasso (KGLasso) model. The KGLasso model automatically identifies the primary meta-features during training; by selecting the most explanatory meta-features for a specific meta-learning task, it makes the recommendation considerably more effective. The model builds on a group-wise generalized multivariate Lasso formulation, for which we establish a minimization algorithm based on a corresponding auxiliary function and prove its convergence and robustness. As an application, we construct a hyperparameter recommendation system with the KGLasso model on 120 UCI datasets for the well-known support vector machine (SVM) algorithm; the system efficiently provides competent hyperparameter recommendations for new tasks. Extensive experiments, including comparisons with popular meta-learning baselines and search algorithms, demonstrate the superiority of the proposed approach and highlight the benefit of integrating model learning with feature selection to construct an automated meta-learner for hyperparameter recommendation.
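The selection mechanism the abstract describes, a multivariate group Lasso that zeroes out whole groups of meta-features at once, can be illustrated with a minimal sketch. This is an illustrative proximal-gradient solver for the plain (non-kernelized) group Lasso objective, not the paper's auxiliary-function algorithm; the group layout, regularization strength, and toy data below are all assumptions for demonstration only:

```python
import numpy as np

def group_lasso_fit(X, Y, groups, lam, n_iter=500):
    """Multivariate group Lasso via proximal gradient descent.

    Minimizes 0.5 * ||Y - X W||_F^2 + lam * sum_g ||W[g]||_F,
    where each g indexes a group of rows of W (one group of
    meta-features). Block soft-thresholding drives entire groups
    to exactly zero, performing group-wise feature selection.
    """
    d, m = X.shape[1], Y.shape[1]
    W = np.zeros((d, m))
    lr = 1.0 / np.linalg.norm(X, 2) ** 2  # 1/L, L = Lipschitz constant
    for _ in range(n_iter):
        W = W - lr * (X.T @ (X @ W - Y))      # gradient step
        for g in groups:                       # block soft-thresholding
            norm = np.linalg.norm(W[g])
            scale = max(0.0, 1.0 - lr * lam / norm) if norm > 0 else 0.0
            W[g] = scale * W[g]
    return W

# Toy demo: 6 hypothetical meta-features in 3 groups; only group 0
# actually drives the 2 hypothetical hyperparameter targets.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 6))
W_true = np.zeros((6, 2))
W_true[0:2] = 1.0
Y = X @ W_true + 0.1 * rng.standard_normal((100, 2))

groups = [[0, 1], [2, 3], [4, 5]]
W = group_lasso_fit(X, Y, groups, lam=8.0)
selected = [g for g in groups if np.linalg.norm(W[g]) > 1e-8]
print("selected meta-feature groups:", selected)
```

On this toy data only the informative group survives the thresholding; the irrelevant groups are shrunk to exactly zero, which is the property that lets the learned model discard uninformative meta-features automatically rather than relying on manual selection.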
Pages: 11