In this paper, we propose decision-tree-based multiclass support vector machines. In training, at the top node we determine the hyperplane that separates one class (or a set of classes) from the others. If the separated region contains more than one class, at the node connected to the top node we determine the hyperplane that separates those classes. We repeat this procedure until each separated region contains only one class. This resolves the unclassifiable regions that exist in conventional multiclass SVMs, but a new problem arises: the division of the feature space depends on the structure of the decision tree. To maintain high generalization ability, the most separable classes should be separated at the upper nodes of the decision tree. To this end, we propose four types of decision trees based on separability measured by the Euclidean distances between class centers and by Mahalanobis-distance-based classifiers. We demonstrate the effectiveness of our methods over conventional SVMs using benchmark data sets.
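To make the tree-construction procedure concrete, the following is a minimal sketch (not the authors' implementation) of one variant: at each node, the class whose center is farthest, by Euclidean distance, from the remaining class centers is split off with a binary SVM, and classification follows the tree until a leaf is reached, so no unclassifiable region remains. The names TreeNode, fit_tree, and predict_tree, and the use of scikit-learn's SVC, are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

class TreeNode:
    def __init__(self, label=None, svm=None, right=None):
        self.label = label   # separated class (internal node) or final class (leaf)
        self.svm = svm       # binary SVM separating `label` from the rest; None at a leaf
        self.right = right   # subtree handling the remaining classes

def fit_tree(X, y):
    classes = np.unique(y)
    if len(classes) == 1:                 # one class left: leaf node
        return TreeNode(label=classes[0])
    centers = {c: X[y == c].mean(axis=0) for c in classes}
    # Separability measure: distance from a class center to its nearest
    # other center. The most separable class is split off first, so the
    # easiest separations happen at the upper nodes of the tree.
    def nearest_center_dist(c):
        return min(np.linalg.norm(centers[c] - centers[o])
                   for o in classes if o != c)
    best = max(classes, key=nearest_center_dist)
    svm = SVC(kernel="rbf").fit(X, (y == best).astype(int))
    rest = y != best
    return TreeNode(label=best, svm=svm, right=fit_tree(X[rest], y[rest]))

def predict_tree(node, x):
    while node.svm is not None:
        if node.svm.predict(x.reshape(1, -1))[0] == 1:
            return node.label             # separated from the rest at this node
        node = node.right                 # descend to the remaining classes
    return node.label
```

Because every test point is routed down the tree to exactly one leaf, every point receives a class label, which is how this scheme avoids the unclassifiable regions of conventional one-against-all or pairwise SVMs.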