Least squares KNN-based weighted multiclass twin SVM

Cited by: 35
Authors
Tanveer, M. [1 ]
Sharma, A. [1 ]
Suganthan, P. N. [2 ]
Affiliations
[1] Indian Inst Technol Indore, Discipline Math, Indore 453552, India
[2] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore, Singapore
Keywords
Twin-KSVC; Weight matrix; Imbalance data; SUPPORT VECTOR MACHINE; CLASSIFICATION;
DOI
10.1016/j.neucom.2020.02.132
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
K-nearest neighbor (KNN) based weighted multi-class twin support vector machines (KWMTSVM) is a novel multi-class classification method. In this paper, we propose a least squares version of KWMTSVM, called LS-KWMTSVM, obtained by replacing the inequality constraints with equality constraints and penalizing the slack variables with the square of the 2-norm instead of the conventional 1-norm. This simple modification leads to a much faster algorithm with better results: the modified primal problems of the proposed LS-KWMTSVM require solving only two systems of linear equations, whereas KWMTSVM requires solving two quadratic programming problems (QPPs). Like KWMTSVM, the proposed LS-KWMTSVM employs a weight matrix in the objective function to exploit the local information of the training samples. To exploit inter-class information, we use weight vectors in the constraints of the proposed LS-KWMTSVM; if any component of these vectors is zero, the corresponding constraint is redundant and can be eliminated. Eliminating redundant constraints and solving systems of linear equations instead of QPPs make the proposed LS-KWMTSVM more robust and faster than KWMTSVM. Like KWMTSVM, the proposed LS-KWMTSVM evaluates all the training data points in a "1-versus-1-versus-rest" structure, and thus generates a ternary output {-1, 0, +1}, which helps it deal with imbalanced datasets. Numerical experiments on several UCI and KEEL imbalanced datasets (with high imbalance ratios) clearly indicate that the proposed LS-KWMTSVM achieves better classification accuracy than other baseline methods with remarkably less computational time. (c) 2020 Elsevier B.V. All rights reserved.
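The core idea described in the abstract, replacing a QPP with a regularised linear system while weighting samples by a KNN-based matrix, can be sketched for a single class pair as follows. This is an illustrative sketch only: the `knn_weight_matrix` density weighting and the helper names are assumptions for demonstration, not the paper's exact formulation, and the full "1-versus-1-versus-rest" structure is omitted.

```python
import numpy as np

def knn_weight_matrix(X, k=3):
    """Diagonal weight matrix: W_ii = 1 + (number of samples that count
    x_i among their k nearest neighbours), a simple density proxy.
    Illustrative weighting, not the paper's exact definition."""
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # squared distances
    np.fill_diagonal(d2, np.inf)                         # exclude self
    nbrs = np.argsort(d2, axis=1)[:, :k]                 # k nearest per row
    counts = np.bincount(nbrs.ravel(), minlength=n)
    return np.diag(1.0 + counts.astype(float))

def ls_twin_plane(A, B, c=1.0, k=3):
    """Least-squares twin-SVM hyperplane (w, b) for class A against class B.

    Primal: min (1/2) z^T H^T D H z + (c/2) ||G z + e||^2  with z = [w; b].
    Setting the gradient to zero yields one linear system, which is the
    speed-up over solving a QPP."""
    H = np.hstack([A, np.ones((len(A), 1))])  # augmented class-A matrix
    G = np.hstack([B, np.ones((len(B), 1))])  # augmented class-B matrix
    D = knn_weight_matrix(A, k)               # KNN weights on class A
    e = np.ones((len(B), 1))
    z = -np.linalg.solve(G.T @ G + (1.0 / c) * H.T @ D @ H, G.T @ e)
    return z[:-1].ravel(), z[-1, 0]           # hyperplane (w, b)
```

The resulting plane passes close to the (weighted) class-A samples while keeping class-B samples at roughly unit distance; repeating this per class pair and thresholding distances would yield the ternary {-1, 0, +1} decisions the abstract describes.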
Pages: 454-464
Page count: 11