Tri-objective optimization-based cascade ensemble pruning for deep forest

Cited by: 0
Authors
Ji J. [1 ]
Li J. [1 ]
Affiliations
[1] Faculty of Information Technology, Beijing University of Technology, Beijing Artificial Intelligence Institute, Beijing Municipal Key Laboratory of Multimedia and Intelligent Software Technology, Beijing
Funding
National Natural Science Foundation of China
Keywords
Coupled diversity; Deep forest; Ensemble learning; Ensemble pruning; Multi-objective optimization;
DOI
10.1016/j.patcog.2023.109744
Abstract
Deep forest is a recent multi-layer ensemble model whose high time cost and storage requirements inhibit its large-scale application. However, existing deep forest pruning methods designed to alleviate these drawbacks do not account for its cascade coupling characteristics. We therefore propose a tri-objective optimization-based cascade ensemble pruning (TOOCEP) algorithm for deep forest. Concretely, we first present a tri-objective optimization-based single-layer pruning (TOOSLP) method that prunes a single layer by simultaneously optimizing three objectives: accuracy, independent diversity, and coupled diversity. In particular, the coupled diversity is designed specifically for deep forest to capture the coupling relationships between adjacent layers. We then apply TOOSLP within a cascade framework to prune the deep forest layer by layer. Experimental results on 15 UCI datasets show that TOOCEP outperforms several state-of-the-art methods in accuracy and pruned rate, significantly reducing the storage space and accelerating the prediction speed of deep forest. © 2023 Elsevier Ltd
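
The abstract names the three objectives but does not specify how they are computed. As a rough illustration only, the Python sketch below scores one candidate pruning of a single cascade layer (a boolean mask over its forests) on accuracy, within-layer ("independent") diversity, and cross-layer ("coupled") diversity. The function name tri_objective_fitness, the majority-vote accuracy, and the disagreement-based diversity measures are assumptions introduced here, not the definitions used in the paper.

import numpy as np

def tri_objective_fitness(layer_preds, prev_layer_preds, y, mask):
    """Score one candidate pruning of a single cascade layer on three objectives.
    The concrete measures (majority-vote accuracy, pairwise-disagreement
    diversities) are illustrative stand-ins, not the paper's definitions.

    layer_preds      : (n_forests, n_samples) int class predictions of this layer's forests
    prev_layer_preds : (n_prev, n_samples)    int class predictions of the retained previous-layer forests
    y                : (n_samples,)           int true labels
    mask             : (n_forests,)           bool, forests kept by this candidate pruning
    """
    kept = layer_preds[mask]
    if kept.shape[0] == 0:
        return 0.0, 0.0, 0.0

    # Objective 1: accuracy of the pruned sub-ensemble under majority voting.
    votes = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, kept)
    accuracy = float(np.mean(votes == y))

    # Objective 2: independent diversity -- mean pairwise disagreement
    # among the forests kept in this layer.
    n = kept.shape[0]
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    independent_div = float(np.mean([np.mean(kept[i] != kept[j]) for i, j in pairs])) if pairs else 0.0

    # Objective 3: coupled diversity -- mean disagreement between the kept
    # forests and the forests retained in the adjacent (previous) layer,
    # standing in for the cascade coupling the abstract refers to.
    if prev_layer_preds.shape[0] > 0:
        coupled_div = float(np.mean([np.mean(k != p) for k in kept for p in prev_layer_preds]))
    else:
        coupled_div = 0.0

    return accuracy, independent_div, coupled_div

In a TOOSLP-style step, candidate masks would then be searched with a multi-objective optimizer (the abstract does not name one); a typical choice would be an evolutionary algorithm such as NSGA-II returning a Pareto front of prunings, from which one mask per layer is kept before the cascade moves to the next layer.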