Cost-sensitive ensemble learning: a unifying framework

Cited by: 0
Authors
George Petrides
Wouter Verbeke
Institutions
[1] University of Bergen
[2] Vrije Universiteit Brussel (VUB)
Source
Data Mining and Knowledge Discovery | 2022, Volume 36
Keywords
Cost-sensitive learning; Class imbalance; Classification; Misclassification cost
DOI
Not available
Abstract
Over the years, a plethora of cost-sensitive methods have been proposed for learning on data in which different types of misclassification errors incur different costs. Our contribution is a unifying framework that provides a comprehensive and insightful overview of cost-sensitive ensemble methods, pinpointing their differences and similarities via a fine-grained categorisation. Our framework contains natural extensions and generalisations of ideas across methods, be it AdaBoost, Bagging or Random Forest, and as a result yields not only all methods known to date but also some not previously considered.
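To make the abstract's premise concrete, here is a minimal sketch of the standard cost-sensitive decision rule that underlies such methods: instead of predicting the most probable class, predict the class that minimises expected misclassification cost under a given cost matrix. This is a generic illustration, not the paper's framework; the cost values and function name are hypothetical.

```python
import numpy as np

# Hypothetical cost matrix: cost[i][j] = cost of predicting class j
# when the true class is i. Here, missing a positive (row 1, col 0)
# is five times as costly as a false alarm (row 0, col 1).
cost = np.array([[0.0, 1.0],
                 [5.0, 0.0]])

def cost_sensitive_predict(probs, cost):
    """probs: (n_samples, n_classes) class-membership probabilities.
    Returns, per sample, the class with the lowest expected cost."""
    expected_cost = probs @ cost          # (n_samples, n_classes)
    return expected_cost.argmin(axis=1)

probs = np.array([[0.7, 0.3],   # argmax alone would predict class 0
                  [0.9, 0.1]])
print(cost_sensitive_predict(probs, cost))  # prints [1 0]
```

For the first sample, the expected cost of predicting class 0 is 0.3 x 5 = 1.5 versus 0.7 x 1 = 0.7 for class 1, so the cost-aware rule flips the argmax decision; cost-sensitive ensemble methods differ mainly in where and how such costs enter the learning procedure.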
Pages: 1-28 (27 pages)