A sparse robust model for large scale multi-class classification based on K-SVCR

Cited by: 6
Authors
Ma, Jiajun [1 ,2 ]
Zhou, Shuisheng [1 ]
Chen, Li [1 ]
Wan, Weiwei [1 ]
Zhang, Zhuan [1 ]
Affiliations
[1] Xidian Univ, Sch Math & Stat, Xian 710071, Shaanxi, Peoples R China
[2] Shangluo Univ, Coll Math & Comp Applicat, Shangluo 726000, Shaanxi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
SVM; Multi-class classification; Outliers; K-SVCR; Sparse solution; SUPPORT VECTOR MACHINE; REGRESSION;
DOI
10.1016/j.patrec.2018.11.012
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
K-Support Vector Classification-Regression (K-SVCR) is a multi-class classification method built on a "1-versus-1-versus-rest" structure; it yields better predictions because all the data are taken into account while attention remains focused on a two-class partition. However, K-SVCR is both sensitive to outliers and time consuming. In this paper, we propose a robust least-squares version of K-SVCR (K-RLSSVCR) based on a squared epsilon-insensitive ramp loss and a truncated least-squares loss; these two nonconvex losses partially suppress the impact of outliers on the new model. With the Concave-Convex Procedure (CCP), solving K-RLSSVCR reduces to solving a single system of linear equations per iteration. For training on large-scale problems, we derive an equivalent K-RLSSVCR model in the primal space (Primal K-RLSSVCR) via the representer theorem, which can admit a sparse solution when the corresponding kernel matrix has low rank. We then design a sparse K-RLSSVCR algorithm (K-SRLSSVCR) that obtains a sparse solution of the Primal K-RLSSVCR by approximating the kernel matrix with a low-rank matrix. Experimental results on an artificial data set and on benchmark data sets show that the proposed method achieves better or comparable classification performance relative to related algorithms, with remarkably less training time and memory consumption, especially when training large-scale problems. (C) 2018 Elsevier B.V. All rights reserved.
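The sparsification step described in the abstract hinges on replacing the full kernel matrix with a low-rank factorization, so that a representer-theorem expansion over all training points can be re-expressed over a small set of landmark points. A minimal sketch of this idea using a Nystrom approximation (the paper's exact low-rank construction is not reproduced here; the RBF kernel, landmark count `m`, and regularization constants below are illustrative assumptions):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.1):
    # Gaussian (RBF) kernel matrix between the rows of X and Y.
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def nystrom_factor(X, m, gamma=0.1, seed=0):
    # Sample m landmark points and build a rank-m factor G with
    # K ~ G @ G.T (Nystrom method).  A representer-theorem solution
    # can then be expanded over the m landmarks alone, which is the
    # source of the sparse kernel expansion described in the abstract.
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    C = rbf_kernel(X, X[idx], gamma)            # n x m cross-kernel block
    W = C[idx]                                  # m x m landmark block
    # Regularized inverse square root of W for numerical stability.
    vals, vecs = np.linalg.eigh(W + 1e-10 * np.eye(m))
    W_inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(np.maximum(vals, 1e-12))) @ vecs.T
    return C @ W_inv_sqrt, idx                  # G is n x m

X = np.random.default_rng(1).normal(size=(200, 3))
K = rbf_kernel(X, X)                            # full 200 x 200 kernel matrix
G, idx = nystrom_factor(X, m=50)
rel_err = np.linalg.norm(K - G @ G.T) / np.linalg.norm(K)
```

With such a factor, any linear system in the full kernel matrix (as produced by each CCP iteration) can be solved through the much smaller m-dimensional factor instead of the full n-by-n matrix, which is where the training-time and memory savings for large-scale problems come from.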
Pages: 16-23 (8 pages)