Reducing Meta-Level Data in Stacking Ensembles for Classification

Cited by: 0
Authors
Bajer, Dražen [1]
Zoric, Bruno [1 ]
Dudjak, Mario [1 ]
Affiliations
[1] Josip Juraj Strossmayer Univ Osijek, Fac Elect Engn Comp Sci & Informat Technol Osijek, Osijek, Croatia
Source
2024 INTERNATIONAL CONFERENCE ON SMART SYSTEMS AND TECHNOLOGIES, SST | 2024
Keywords
classification; ensembles; meta-learning; meta-level data; stacking
DOI
10.1109/SST61991.2024.10755194
CLC Classification Code
TP39 [Computer Applications]
Subject Classification Codes
081203; 0835
Abstract
Stacking as an ensemble framework relies on meta-learning for combining the outputs of multiple base classifiers. What exactly the outputs represent and how they are used for meta-learning is an important aspect of stacking. Commonly, the class-level prediction probabilities produced by the base classifiers are used for training the meta-classifier. The dimensionality of this meta-level data is, therefore, not negligible. Reducing it would call for a less complex meta-classifier and would thereby simplify the process of meta-learning. This paper presents a straightforward approach for aggregating the prediction probabilities of the base classifiers. By calculating the mean and standard deviation at the class level, a substantial reduction in the dimensionality of the meta-level data is achieved. The experimental analysis, conducted on multiple diverse datasets, suggests that the aggregation preserves sufficient information for meta-learning, since better or highly competitive performance was attained with respect to the common stacking ensemble framework (which utilises the full meta-level data).
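The aggregation the abstract describes can be illustrated with a minimal Python sketch. The function name, data layout, and numbers below are illustrative assumptions, not taken from the paper: each base classifier contributes a probability vector of length `n_classes`, and the full meta-level feature vector of size `n_base_classifiers * n_classes` is reduced to `2 * n_classes` values (the per-class mean and standard deviation across classifiers).

```python
from statistics import mean, pstdev

def aggregate_meta_features(probas):
    """Reduce stacked prediction probabilities to per-class summary statistics.

    probas: list of per-classifier probability vectors, each of length
    n_classes (i.e. shape (n_base_classifiers, n_classes)).

    Returns a vector of length 2 * n_classes: the per-class means followed
    by the per-class standard deviations, in place of the
    n_base_classifiers * n_classes features used by plain stacking.
    """
    n_classes = len(probas[0])
    # Collect, for each class, the probabilities assigned by every base classifier.
    per_class = [[p[c] for p in probas] for c in range(n_classes)]
    means = [mean(col) for col in per_class]
    stds = [pstdev(col) for col in per_class]
    return means + stds

# Example: three base classifiers on a binary problem.
# Plain stacking would feed the meta-classifier 3 * 2 = 6 features;
# the aggregation yields only 2 * 2 = 4.
probas = [
    [0.9, 0.1],
    [0.7, 0.3],
    [0.8, 0.2],
]
features = aggregate_meta_features(probas)
```

Note that the reduced dimensionality is independent of the number of base classifiers, so the meta-classifier's input size stays fixed even as the ensemble grows.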
Pages: 133-138
Page count: 6