OPTIMAL PARTITIONING FOR CLASSIFICATION AND REGRESSION TREES

Cited by: 147
Authors
CHOU, PA [1 ]
Affiliations
[1] STANFORD UNIV, DEPT ELECT ENGN, STANFORD, CA 94305 USA
Keywords
CLUSTERING; DECISION TREES; INFORMATION DIVERGENCE; TEXT-TO-SPEECH
DOI
10.1109/34.88569
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In designing a decision tree for classification or regression, one selects at each node a feature to be tested, and partitions the range of that feature into a small number of bins, each bin corresponding to a child of the node. When the feature's range is discrete with N unordered outcomes, the optimal partition, that is, the partition minimizing an expected loss, is usually found by an exhaustive search through all possible partitions. Since the number of possible partitions grows exponentially in N, this approach is impractical when N is larger than about 10 or 20. In this paper, we present an iterative algorithm that finds a locally optimal partition for an arbitrary loss function, in time linear in N for each iteration. The algorithm is a K-means-like clustering algorithm that uses as its distance measure a generalization of Kullback's information divergence. Moreover, we prove that the globally optimal partition must satisfy a nearest neighbor condition using divergence as the distance measure. These results generalize similar results of Breiman et al. to an arbitrary number of classes or regression variables and to an arbitrary number of bins. We also provide experimental results on a text-to-speech example, and we suggest additional applications of the algorithm, including the design of variable combinations, surrogate splits, composite nodes, and decision graphs.
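The iterative algorithm described in the abstract can be illustrated with a short sketch. The following Python is a minimal illustration under stated assumptions, not the paper's implementation: it specializes to the classification case with log loss, where each bin's centroid is the probability-weighted average of its members' class distributions and the distance measure is Kullback's information divergence. All names here (`kl`, `partition`, `cond`, `w`, `k`) are illustrative, not from the paper.

```python
# Minimal sketch of a K-means-like partitioning of N discrete outcomes
# into k bins, using Kullback divergence as the distance measure.
# Assumes the classification/log-loss case; `cond[x]` is p(class | x)
# and `w[x]` is p(x) for each of the N unordered outcomes x.
import numpy as np

def kl(p, q, eps=1e-12):
    """Kullback information divergence D(p || q), in nats."""
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return float(np.sum(p * np.log(p / q)))

def partition(cond, w, k, iters=100, seed=0):
    """Locally minimize sum_x w[x] * D(cond[x] || centroid of x's bin)."""
    cond, w = np.asarray(cond, float), np.asarray(w, float)
    rng = np.random.default_rng(seed)
    n = len(cond)
    assign = rng.integers(0, k, size=n)           # random initial partition
    for _ in range(iters):
        # Centroid step: each bin's centroid is the w-weighted average
        # of its members' class distributions (empty bins are reseeded).
        cents = np.array([
            np.average(cond[assign == j], axis=0, weights=w[assign == j])
            if np.any(assign == j) else cond[rng.integers(n)]
            for j in range(k)
        ])
        # Assignment step: move each outcome to its nearest centroid
        # under the divergence; this is the nearest neighbor condition.
        new = np.array([np.argmin([kl(p, c) for c in cents]) for p in cond])
        if np.array_equal(new, assign):
            break                                  # local optimum reached
        assign = new
    return assign

# Example: 5 outcomes over 2 classes, grouped into 2 bins.
cond = np.array([[.9, .1], [.8, .2], [.2, .8], [.1, .9], [.5, .5]])
w = np.full(5, 0.2)
print(partition(cond, w, k=2))
```

Each pass makes O(N * k) divergence evaluations, consistent with the per-iteration cost linear in N claimed in the abstract; as with K-means, only a locally optimal partition is guaranteed.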
Pages: 340-354 (15 pages)