Which is the best multiclass SVM method? An empirical study

Cited by: 0
Authors
Duan, KB
Keerthi, SS
Affiliations
[1] Nanyang Technol Univ, Bioinformat Res Ctr, Singapore 639798, Singapore
[2] Yahoo Res Labs, Pasadena, CA 91105 USA
Source
MULTIPLE CLASSIFIER SYSTEMS | 2005, Vol. 3541
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multiclass SVMs are usually implemented by combining several two-class SVMs. The one-versus-all method using the winner-takes-all strategy and the one-versus-one method implemented by max-wins voting are widely used for this purpose. In this paper we give empirical evidence to show that these methods are inferior to another one-versus-one method: one that uses Platt's posterior probabilities together with the pairwise coupling idea of Hastie and Tibshirani. The evidence is particularly strong when the training dataset is sparse.
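The abstract contrasts three ways of combining binary SVMs, which the short sketch below illustrates. This is a minimal illustration and not the authors' code: it assumes scikit-learn and the Iris dataset; SVC(probability=True) fits Platt-style sigmoids to the pairwise SVM outputs and couples them into class posteriors, which approximates (but is not identical to) the Hastie-Tibshirani pairwise coupling studied in the paper; the kernel, hyperparameters, and the deliberately small training split are illustrative assumptions only.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
# A small training split (10 examples per class) mimics the sparse-data
# regime in which the paper reports the clearest differences.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, train_size=30, stratify=y, random_state=0)

svm = dict(kernel="rbf", C=1.0, gamma="scale")  # illustrative settings

# 1) One-versus-all with winner-takes-all: one binary SVM per class;
#    predict the class whose SVM returns the largest decision value.
ova = OneVsRestClassifier(SVC(**svm)).fit(X_tr, y_tr)

# 2) One-versus-one with max-wins voting: one binary SVM per class pair;
#    each pairwise SVM casts a vote and the most-voted class wins.
ovo = OneVsOneClassifier(SVC(**svm)).fit(X_tr, y_tr)

# 3) One-versus-one with Platt posteriors: probability=True fits a sigmoid
#    to each pairwise SVM's outputs and couples the pairwise probabilities
#    into class posteriors.
pwc = SVC(**svm, probability=True, random_state=0).fit(X_tr, y_tr)
# SVC.predict still uses max-wins voting even when probability=True, so
# take the argmax of the coupled posteriors explicitly.
pwc_pred = pwc.classes_[np.argmax(pwc.predict_proba(X_te), axis=1)]

print(f"one-vs-all, winner-takes-all:  {ova.score(X_te, y_te):.3f}")
print(f"one-vs-one, max-wins voting:   {ovo.score(X_te, y_te):.3f}")
print(f"one-vs-one, Platt + coupling:  {np.mean(pwc_pred == y_te):.3f}")
```

Note that scikit-learn (via LIBSVM) couples the pairwise sigmoids with the method of Wu, Lin, and Weng, a later refinement of the Hastie-Tibshirani coupling, so the third classifier is a close stand-in for, not an exact reproduction of, the method the paper recommends.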
Pages: 278-285
Number of pages: 8
References (12 in total)
  • [1] [Anonymous]. Advances in Neural Information Processing Systems.
  • [2] Boser BE, Guyon IM, Vapnik VN. A training algorithm for optimal margin classifiers. Proceedings of the Fifth Annual ACM Workshop on Computational Learning Theory, 1992, pp. 144-152. DOI: 10.1145/130385.130401.
  • [3] Dietterich TG, Bakiri G. Solving multiclass learning problems via error-correcting output codes. Journal of Artificial Intelligence Research, 1994, 2: 263-286.
  • [4] Duan KB, Keerthi SS. Which is the best multiclass SVM method? An empirical study. Technical Report CD-03-12, National University of Singapore, 2003.
  • [5] Hastie T, Tibshirani R. Classification by pairwise coupling. Advances in Neural Information Processing Systems, 1998.
  • [6] Hsu CW, Lin CJ. A comparison of methods for multiclass support vector machines. IEEE Transactions on Neural Networks, 2002, 13(2): 415-425.
  • [7] Lin H-T, Lin C-J, Weng RC. A note on Platt's probabilistic outputs for support vector machines. Technical report, 2003.
  • [8] Platt JC. Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods. Advances in Large Margin Classifiers, 2000, pp. 61-74.
  • [9] Rifkin R, Klautau A. In defense of one-vs-all classification. Journal of Machine Learning Research, 2004, 5: 101-141.
  • [10] Roth V. Probabilistic discriminative kernel classifiers for multi-class problems. Pattern Recognition: 23rd DAGM Symposium, Proceedings (Lecture Notes in Computer Science, Vol. 2191), 2001, pp. 246-253.