Decision-tree-based multiclass support vector machines

Cited by: 0
Authors
Takahashi, F [1]
Abe, S [1]
Affiliation
[1] Kobe Univ, Grad Sch Sci & Technol, Kobe, Hyogo, Japan
Source
ICONIP'02: PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE ON NEURAL INFORMATION PROCESSING: COMPUTATIONAL INTELLIGENCE FOR THE E-AGE | 2002
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
In this paper, we propose decision-tree-based multiclass support vector machines. In training, at the top node we determine the hyperplane that separates one class (or a group of classes) from the others. If the separated region still contains multiple classes, at the node connected to the top node we determine the hyperplane that separates them, and we repeat this procedure until only one class remains in each separated region. This resolves the unclassifiable regions that exist in conventional multiclass SVMs, but a new problem arises: the division of the feature space depends on the structure of the decision tree. To maintain high generalization ability, the most separable classes should be separated at the upper nodes of the tree. To this end, we propose four types of decision trees based on separability, measured either by the Euclidean distances between class centers or by Mahalanobis-distance-based classifiers. We demonstrate the effectiveness of our methods over conventional SVMs using benchmark data sets.
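The training procedure described in the abstract can be sketched in Python. This is a minimal illustration, not the authors' implementation: it uses scikit-learn's binary `SVC` as the node classifier and one of the four separability criteria mentioned in the abstract (the class whose center has the largest mean Euclidean distance to the remaining class centers is separated first). The class name `DecisionTreeSVM` and its attribute names are hypothetical.

```python
# Hedged sketch of a decision-tree multiclass SVM (assumed structure,
# not the paper's code). At each node a binary SVM separates one class
# from the classes still remaining; the most separable class (largest
# mean Euclidean distance between class centers) is split off first.
import numpy as np
from sklearn.svm import SVC


class DecisionTreeSVM:
    def __init__(self, **svc_params):
        self.svc_params = svc_params

    def fit(self, X, y):
        self.order_ = []   # classes in the order they are separated
        self.nodes_ = []   # one binary SVM per non-leaf node
        classes = list(np.unique(y))
        centers = {c: X[y == c].mean(axis=0) for c in classes}
        while len(classes) > 1:
            # Separability criterion: mean Euclidean distance from a
            # class center to the centers of the remaining classes.
            def sep(c):
                return np.mean([np.linalg.norm(centers[c] - centers[o])
                                for o in classes if o != c])
            c = max(classes, key=sep)
            # Train on the samples of the classes still in play:
            # target is "is this sample in class c?"
            mask = np.isin(y, classes)
            clf = SVC(**self.svc_params).fit(X[mask], y[mask] == c)
            self.order_.append(c)
            self.nodes_.append(clf)
            classes.remove(c)
        self.last_ = classes[0]  # class left in the final region
        return self

    def predict(self, X):
        out = np.full(len(X), self.last_, dtype=object)
        undecided = np.ones(len(X), dtype=bool)
        for c, clf in zip(self.order_, self.nodes_):
            hit = np.zeros(len(X), dtype=bool)
            hit[undecided] = clf.predict(X[undecided])
            out[hit] = c           # assigned at this node
            undecided &= ~hit      # the rest descend the tree
            if not undecided.any():
                break
        return out
```

Each test sample descends the tree until some node's binary SVM claims it, so no region of the feature space is left unclassified; the separation order fixed at training time is what the four proposed criteria control.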
Pages: 1418-1422
Page count: 5