Evidential Random Forests

Cited by: 21
Authors
Hoarau, Arthur [1 ]
Martin, Arnaud [1 ]
Dubois, Jean-Christophe [1 ]
Le Gall, Yolande [1 ]
Affiliations
[1] Univ Rennes, CNRS, IRISA, DRUID, F-22000 Lannion, France
Keywords
Decision tree; Random forest; Classification; Rich labels; Dempster-Shafer theory
DOI
10.1016/j.eswa.2023.120652
CLC classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In machine learning, some models can make uncertain and imprecise predictions; they are called evidential models. Such models may also handle imperfect labeling and take into account labels that are richer than the commonly used hard labels, carrying uncertainty and imprecision. This paper proposes an Evidential Decision Tree and an Evidential Random Forest. Both models combine a distance with a degree of inclusion so that observations whose response elements are included in one another can be grouped into a single node. Experimental results show better performance for the proposed methods than for other evidential models and for recent Cautious Random Forests when the data are noisy. The models are also more robust to overfitting on datasets whose labels, collected from contributors, are genuinely uncertain and imprecise. Finally, the proposed models can predict rich labels, information that can be reused in other approaches such as active learning.
Pages: 13
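The abstract mentions grouping observations whose evidential (Dempster-Shafer) labels are included in one another. As a rough illustration only, and not the authors' implementation, the sketch below shows one common way a degree of inclusion between two mass functions could be computed in Python; the function name, the mass-product weighting, and the example labels are assumptions made for this sketch.

```python
# A minimal sketch (assumed formulation, not the paper's exact definition) of a
# degree of inclusion between two Dempster-Shafer mass functions. Focal sets
# are frozensets of class labels; a mass function maps each focal set to its mass.

from itertools import product


def degree_of_inclusion(m1: dict, m2: dict) -> float:
    """Mass-weighted proportion of focal-set pairs (A, B) with A included in B.

    Returns a value in [0, 1]; 1 means every focal set of m1 is contained in
    some focal set of m2, weighted by the product of their masses.
    """
    return sum(
        m1[a] * m2[b]
        for a, b in product(m1, m2)
        if a <= b  # frozenset inclusion: A is a subset of B
    )


# Hypothetical example: a precise "cat" label is fully included in an imprecise label.
m_precise = {frozenset({"cat"}): 1.0}
m_imprecise = {frozenset({"cat", "dog"}): 0.8, frozenset({"cat", "dog", "bird"}): 0.2}

print(degree_of_inclusion(m_precise, m_imprecise))  # 1.0: {"cat"} is inside both focal sets
print(degree_of_inclusion(m_imprecise, m_precise))  # 0.0: neither focal set fits inside {"cat"}
```

Under this (assumed) formulation, a high degree of inclusion between two labeled observations would justify placing them in the same tree node, since the more precise label is compatible with the more imprecise one.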