Probabilistic Ensemble of Deep Information Networks

Cited: 5
Authors
Franzese, Giulio [1]
Visintin, Monica [1]
Institution
[1] Politecnico di Torino, Department of Electronics and Telecommunications, I-10100 Turin, Italy
Keywords
information theory; information bottleneck; classifier; decision tree; ensemble; bottleneck; capacity
DOI
10.3390/e22010100
Chinese Library Classification
O4 [Physics]
Discipline Code
0702
Abstract
We describe a classifier built as an ensemble of decision trees, designed using information-theoretic concepts. In contrast to algorithms such as C4.5 or ID3, each tree is built from the leaves rather than from the root. Each tree is made of nodes trained independently of one another to minimize a local cost function (the information bottleneck). A trained tree outputs the estimated probabilities of the classes given the input datum, and the outputs of many trees are combined to decide the class. We show that the system achieves accuracy comparable to that of a standard tree classifier, while offering advantages in modularity, reduced complexity, and memory requirements.
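The abstract states that each tree outputs estimated class probabilities and that the outputs of many trees are combined to decide the class, but this record does not specify the combination rule. The minimal sketch below assumes a product-of-experts (log-sum) rule as one plausible choice; the function name `combine_tree_outputs` and the example probabilities are hypothetical, not taken from the paper.

```python
import math

def combine_tree_outputs(tree_probs):
    """Combine per-tree class-probability estimates into one ensemble
    posterior by summing log-probabilities (i.e., multiplying the
    per-tree estimates) and renormalizing. Each inner list is one
    tree's estimate of P(class | x)."""
    n_classes = len(tree_probs[0])
    log_scores = [0.0] * n_classes
    for probs in tree_probs:
        for c, p in enumerate(probs):
            log_scores[c] += math.log(max(p, 1e-12))  # guard against log(0)
    # renormalize via a numerically stable softmax over summed logs
    m = max(log_scores)
    exps = [math.exp(s - m) for s in log_scores]
    z = sum(exps)
    return [e / z for e in exps]

# three hypothetical trees, each estimating probabilities for two classes
trees = [[0.7, 0.3], [0.6, 0.4], [0.8, 0.2]]
posterior = combine_tree_outputs(trees)
predicted = max(range(len(posterior)), key=posterior.__getitem__)
```

Multiplying probabilities treats the trees as independent evidence sources, so agreement between trees sharpens the ensemble posterior; a simple average of the probability vectors would be an equally defensible alternative reading of "combined".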
Pages: 17
Cited References
28 in total
[21] Norouzi M., 2015, Advances in Neural Information Processing Systems, V28.
[22] Quinlan J.R., 1986, Machine Learning, V1, P81, DOI 10.1007/BF00116251.
[23] Quinlan J.R., 2014, C4.5: Programs for Machine Learning.
[24] Quinlan J.R., 1996, Improved use of continuous attributes in C4.5, Journal of Artificial Intelligence Research, V4, P77-90.
[25] Rokach L., Maimon O., 2008, Data Mining with Decision Trees, V69.
[26] Salekin A., Stankovic J., 2016, Detection of Chronic Kidney Disease and Selecting Important Predictive Attributes, 2016 IEEE International Conference on Healthcare Informatics (ICHI), P262-270.
[27] Slonim N., 2000, Advances in Neural Information Processing Systems, V12, P617.
[28] Still S., 2014, Information Bottleneck Approach to Predictive Inference, Entropy, V16(02), P968-989.