K-nearest neighbor-based weighted multi-class twin support vector machine

Cited by: 36
Author
Xu, Yitian [1]
Affiliation
[1] China Agr Univ, Coll Sci, Beijing 100083, Peoples R China
Keywords
TSVM; K-nearest neighbor; Weights; Multi-class classification; PATTERN-CLASSIFICATION; RECOGNITION; CLASSIFIERS;
DOI
10.1016/j.neucom.2016.04.024
Chinese Library Classification (CLC) number
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
Twin-KSVC, a recent multi-class classification algorithm, finds two nonparallel hyper-planes for the two focused classes of samples by solving a pair of smaller-sized quadratic programming problems (QPPs), which makes its learning speed faster than that of other multi-class classification algorithms. However, it ignores the local information of the samples, so every sample receives the same weight when the separating hyper-planes are constructed, even though different samples in fact influence the hyper-planes to different degrees. Motivated by this observation, we propose a K-nearest neighbor (KNN)-based weighted multi-class twin support vector machine (KWMTSVM) in this paper. A weight matrix W is employed in the objective function to exploit the intra-class local information, while two weight vectors f and h are introduced into the constraints to exploit the inter-class information. When a component f_j = 0 or h_k = 0, the corresponding j-th or k-th constraint is redundant, and removing these redundant constraints effectively improves the computational speed of the classifier. Experimental results on eleven benchmark datasets and the ABCD dataset demonstrate the validity of the proposed algorithm. (C) 2016 Elsevier B.V. All rights reserved.
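As an illustration of the weighting scheme sketched in the abstract, the short Python snippet below builds a KNN-based intra-class weight matrix W and an inter-class 0/1 indicator vector f for one "focused class versus other class" pair. This is a minimal sketch of the idea only, assuming a simple binary neighbourhood weighting; the function names (intra_class_weights, inter_class_flags) and the use of scikit-learn's NearestNeighbors are illustrative choices, not the paper's formulation.

```python
# Sketch of the KNN-based weighting idea from the abstract (not the
# paper's exact algorithm): W captures intra-class local information,
# f marks which other-class samples are K-nearest neighbours of the
# focused class; constraints with f[j] == 0 would be dropped as redundant.
import numpy as np
from sklearn.neighbors import NearestNeighbors


def intra_class_weights(X_a, k=5):
    """Symmetric 0/1 KNN adjacency within one class (the weight matrix W)."""
    n = X_a.shape[0]
    nn = NearestNeighbors(n_neighbors=min(k + 1, n)).fit(X_a)
    _, idx = nn.kneighbors(X_a)        # idx[i, 0] is the point itself
    W = np.zeros((n, n))
    for i in range(n):
        W[i, idx[i, 1:]] = 1.0         # mark the k nearest neighbours
    return np.maximum(W, W.T)          # symmetrise


def inter_class_flags(X_a, X_b, k=5):
    """0/1 vector f: f[j] = 1 iff X_b[j] is a KNN of some sample in X_a."""
    nn = NearestNeighbors(n_neighbors=min(k, X_b.shape[0])).fit(X_b)
    _, idx = nn.kneighbors(X_a)
    f = np.zeros(X_b.shape[0])
    f[np.unique(idx)] = 1.0
    return f


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_pos = rng.normal(0.0, 1.0, size=(40, 2))   # focused class (toy data)
    X_neg = rng.normal(3.0, 1.0, size=(40, 2))   # one of the other classes
    W = intra_class_weights(X_pos, k=5)
    f = inter_class_flags(X_pos, X_neg, k=5)
    # Only constraints with f[j] == 1 would be kept when solving the QPP.
    print("kept", int(f.sum()), "of", len(f), "inter-class constraints")
```

In this toy setup only the other-class samples lying near the focused class keep a nonzero flag, which is what allows the redundant constraints to be removed before the QPP is solved.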
Pages: 430-438
Number of pages: 9
Related papers
(50 in total)
  • [31] Multi-class classification using kernel density estimation on K-nearest
    Tang, Xiaofeng
    Xu, Aiqiang
    ELECTRONICS LETTERS, 2016, 52 (08) : 600 - 601
  • [32] Multi-class classification algorithm based on Support Vector Machine
    Yang Kuihe
    Yuan Min
    7TH INTERNATIONAL CONFERENCE ON MEASUREMENT AND CONTROL OF GRANULAR MATERIALS, PROCEEDINGS, 2006, : 322 - 325
  • [33] COMPARATIVE ANALYSIS OF K-NEAREST NEIGHBOR AND SUPPORT VECTOR MACHINE IN CLASSIFICATION OF COVID-19 DISEASE IN MAKASSAR CITY
    Abuspin, Muammar Ashari
    Herdiani, Erna Tri
    Tinungki, Georgina Maria
    COMMUNICATIONS IN MATHEMATICAL BIOLOGY AND NEUROSCIENCE, 2024,
  • [34] The best separating decision tree twin support vector machine for multi-class classification
    Shao, Yuan-Hai
    Chen, Wei-Jie
    Huang, Wen-Biao
    Yang, Zhi-Min
    Deng, Nai-Yang
    FIRST INTERNATIONAL CONFERENCE ON INFORMATION TECHNOLOGY AND QUANTITATIVE MANAGEMENT, 2013, 17 : 1032 - 1038
  • [35] Self-training algorithm for hyperspectral imagery classification based on mixed measurement k-nearest neighbor and support vector machine
    Ge, Haimiao
    Pan, Haizhu
    Wang, Liguo
    Liu, Moqi
    Li, Cheng
    JOURNAL OF APPLIED REMOTE SENSING, 2021, 15 (04)
  • [36] A hybrid text classification approach with low dependency on parameter by integrating K-nearest neighbor and support vector machine
    Wan, Chin Heng
    Lee, Lam Hong
    Rajkumar, Rajprasad
    Isa, Dino
    EXPERT SYSTEMS WITH APPLICATIONS, 2012, 39 (15) : 11880 - 11888
  • [37] Least squares recursive projection twin support vector machine for multi-class classification
    Yang, Zhi-Min
    Wu, He-Ji
    Li, Chun-Na
    Shao, Yuan-Hai
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2016, 7 (03) : 411 - 426
  • [38] WITHIN-CLASS PENALTY BASED MULTI-CLASS SUPPORT VECTOR MACHINE
    Shi, Xiaoshuang
    Guo, Zhenhua
    Yang, Yujiu
    Yang, Lin
    2015 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2015, : 2746 - 2750
  • [40] A MULTI-CLASS SUPPORT VECTOR MACHINE: THEORY AND MODEL
    Sun, Minghe
    INTERNATIONAL JOURNAL OF INFORMATION TECHNOLOGY & DECISION MAKING, 2013, 12 (06) : 1175 - 1199