An improved Twin-KSVC with its applications

Cited by: 6
Authors
Ai, Qing [1 ,2 ]
Wang, Anna [1 ]
Wang, Yang [1 ]
Sun, Haijing [1 ]
Affiliations
[1] Northeastern Univ, Coll Informat Sci & Engn, Shenyang 110819, Liaoning, Peoples R China
[2] Univ Sci & Technol Liaoning, Sch Software, Anshan 114051, Liaoning, Peoples R China
Keywords
Twin-KSVC; K-SVCR; Multi-class classification; One-versus-one-versus-rest structure; SUPPORT VECTOR MACHINE; MARGIN; SVM;
DOI
10.1007/s00521-018-3487-0
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Twin-KSVC (Xu et al. in Cognit Comput 5(4):580-588, 2013) is a novel multi-class classifier that extends K-SVCR (Angulo et al. in Neurocomputing 55(1-2):57-77, 2003). Compared with K-SVCR, Twin-KSVC trains faster. However, the classical Twin-KSVC has several drawbacks. (a) Each pair of sub-classifiers implements only empirical risk minimization, which degrades generalization performance. (b) Each pair of sub-classifiers must compute large-scale matrix inverses, which is intractable or even impossible in practical applications. (c) The classical Twin-KSVC does not provide a suitable training algorithm for large-scale datasets. (d) For the nonlinear case, the classical Twin-KSVC has to construct additional primal problems based on an approximate kernel-generated surface. To address these drawbacks, we propose an improved version in this paper, called ITKSVC. First, we introduce regularization terms into each pair of sub-classifiers so that each pair implements structural risk minimization. Second, we derive the dual problems of each pair of sub-classifiers, which allows ITKSVC to avoid computing large-scale matrix inverses. Third, to speed up training of each pair of sub-classifiers on large-scale datasets, we apply the successive overrelaxation (SOR) method. Finally, the dual problems of each pair of sub-classifiers admit the kernel trick directly in the nonlinear case. Experimental results on several benchmark datasets indicate that, compared with Twin-KSVC, the proposed ITKSVC achieves better classification performance on large-scale datasets.
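The record above does not include solver details. As a rough, hedged illustration of the successive overrelaxation step mentioned in the abstract, the sketch below applies a generic SOR (Gauss-Seidel with overrelaxation) update to a box-constrained dual QP of the form min 0.5*a'Qa - e'a subject to 0 <= a <= C, which is the typical shape of twin-SVM-style dual problems. The matrix Q, the bound C, the relaxation factor omega, and the function name sor_box_qp are illustrative assumptions, not taken from the paper.

import numpy as np

def sor_box_qp(Q, C, omega=1.5, max_iter=1000, tol=1e-6):
    """Illustrative SOR solver for min 0.5*a'Qa - e'a  s.t. 0 <= a <= C.

    Generic sketch of successive overrelaxation for a box-constrained
    dual QP; not the exact ITKSVC formulation from the paper.
    """
    n = Q.shape[0]
    alpha = np.zeros(n)
    for _ in range(max_iter):
        alpha_prev = alpha.copy()
        for i in range(n):                        # Gauss-Seidel sweep over coordinates
            grad_i = Q[i] @ alpha - 1.0           # gradient of the objective w.r.t. alpha_i
            step = alpha[i] - omega * grad_i / Q[i, i]
            alpha[i] = min(max(step, 0.0), C)     # project back onto the box [0, C]
        if np.linalg.norm(alpha - alpha_prev) < tol:
            break
    return alpha

# Toy usage: in practice Q would be built from the (kernel) Gram matrix of one sub-problem.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 5))
    Q = X @ X.T + 1e-3 * np.eye(50)               # positive-definite toy matrix
    alpha = sor_box_qp(Q, C=1.0)

Convergence of this kind of coordinate scheme is usually ensured by keeping the relaxation factor omega in (0, 2); values above 1 overrelax the plain Gauss-Seidel step.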
Pages: 6615-6624
Number of pages: 10
Related papers
34 items in total
  • [1] K-SVCR: A support vector machine for multi-class classification
    Angulo, C
    Parra, X
    Català, A
    [J]. NEUROCOMPUTING, 2003, 55 (1-2) : 57 - 77
  • [2] [Anonymous], PAIRWISE CLASSIFICAT
  • [3] [Anonymous], 2000, NATURE STAT LEARNING
  • [4] Bennett KP, 1999, COMBINING SUPPORT VE
  • [5] Chen SC, 2017, IOP EXPAND PHYS, P1, DOI 10.1088/978-0-7503-1674-3
  • [6] MLTSVM: A novel twin support vector machine to multi-label learning
    Chen, Wei-Jie
    Shao, Yuan-Hai
    Li, Chun-Na
    Deng, Nai-Yang
    [J]. PATTERN RECOGNITION, 2016, 52 : 61 - 74
  • [7] Lung nodule classification using artificial crawlers, directional texture and support vector machine
    Froz, Bruno Rodrigues
    de Carvalho Filho, Antonio Oseas
    Silva, Aristofanes Correa
    de Paiva, Anselmo Cardoso
    Nunes, Rodolfo Acatauassu
    Gattass, Marcelo
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2017, 69 : 176 - 188
  • [8] Face Recognition with Occlusion Using a Wireframe Model and Support Vector Machine
    Garcia, E.
    Escamilla, E.
    Nakano, M.
    Perez, H.
    [J]. IEEE LATIN AMERICA TRANSACTIONS, 2017, 15 (10) : 1960 - 1966
  • [9] Twin support vector machines for pattern classification
    Jayadeva
    Khemchandani, R.
    Chandra, Suresh
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2007, 29 (05) : 905 - 910
  • [10] Wind speed prediction using reduced support vector machines with feature selection
    Kong, Xiaobing
    Liu, Xiangjie
    Shi, Ruifeng
    Lee, Kwang Y.
    [J]. NEUROCOMPUTING, 2015, 169 : 449 - 456