A study on regularized Weighted Least Square Support Vector Classifier

Cited by: 20
Authors
Yang, Bo [1 ]
Shao, Quan-ming [1 ]
Pan, Li [1 ]
Li, Wen-bin [1 ]
Affiliations
[1] Hunan Inst Sci & Technol, Sch Informat & Commun Engn, Yueyang 414000, Peoples R China
Keywords
Weighted Least Square Support Vector Classifier; Regularization technique; Robust estimation; Sparse classifier; Regression
DOI
10.1016/j.patrec.2018.03.002
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The Least Square Support Vector Machine (LSSVM) has been widely used for solving regression and classification problems because its solution is simple to compute. However, LSSVM has some drawbacks in practice, such as lack of robustness, loss of sparseness in its solution, and inapplicability to some classification problems. In order to find a robust and sparse solution to the classification problem using least-square methods, we propose a novel regularized Weighted Least Square Support Vector Classifier in this paper. First, we give a basic optimized Weighted Least Square Support Vector Classifier model, which can be used to obtain the best weights according to the distance between samples and the classification boundary, and to obtain an extremely sparse solution. In order to control the sparsity of the solution, we further propose a regularized Weighted Least Square Support Vector Classifier model. After a theoretical analysis of the regularization function, we construct two kinds of regularization functions that meet our requirements, and then design corresponding algorithms for optimizing the weights and finding the best solution. The proposed method is evaluated on artificial datasets and several important benchmark databases in machine learning, and achieves more encouraging results than some state-of-the-art approaches. (C) 2018 Elsevier B.V. All rights reserved.
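The weighted LSSVM classifier described above can be sketched as follows. This is a minimal illustration of the general weighted LSSVM technique, not the authors' exact formulation: the RBF kernel choice, the parameter names (`C`, `gamma`), and the per-sample weights `v_i` are assumptions. In the standard LSSVM dual, the classifier comes from a single linear KKT system; weighting replaces the uniform ridge term `1/C` with per-sample terms `1/(C*v_i)`, so a sample with a small weight incurs a larger ridge penalty and influences the boundary less:

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian RBF kernel matrix between the rows of X1 and X2
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def weighted_lssvm_fit(X, y, C=10.0, weights=None, gamma=1.0):
    """Solve the (weighted) LSSVM classifier KKT system.

    y must be in {-1, +1}; weights[i] = v_i down-weights suspected
    outliers (v_i < 1 -> larger ridge term -> less influence).
    """
    n = len(y)
    if weights is None:
        weights = np.ones(n)
    Omega = np.outer(y, y) * rbf_kernel(X, X, gamma)
    # KKT system: [[0, y^T], [y, Omega + diag(1/(C v))]] [b; alpha] = [0; 1]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.diag(1.0 / (C * weights))
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # alpha, b

def predict(X_train, y_train, alpha, b, X_test, gamma=1.0):
    # decision value f(x) = sum_i alpha_i y_i K(x, x_i) + b
    K = rbf_kernel(X_test, X_train, gamma)
    return np.sign(K @ (alpha * y_train) + b)
```

Note that the unweighted case (all `v_i = 1`) reduces to the classical LSSVM of Suykens et al.; the paper's contribution concerns how the weights are chosen (by distance to the classification boundary) and regularized to control sparsity, which this sketch does not reproduce.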
Pages: 48-55
Page count: 8
References
27 items in total
[1] Atkinson A. C., 2003, EXPLORING MULTIVARIA
[2] Atkinson A. C., Riani M., Torti F. Robust methods for heteroskedastic regression. Computational Statistics & Data Analysis, 2016, 104: 209-222.
[3] Carvalho B. P. R., Braga A. P. IP-LSSVM: A two-step sparse classifier. Pattern Recognition Letters, 2009, 30(16): 1507-1515.
[4] Chen K., Lv Q., Lu Y., Dou Y. Robust regularized extreme learning machine for regression using iteratively reweighted least squares. Neurocomputing, 2017, 230: 345-358.
[5] Croux C., Rousseeuw P. J., Hossjer O. Generalized S-estimators. Journal of the American Statistical Association, 1994, 89(428): 1271-1281.
[6] de Kruif B. J., de Vries T. J. A. Pruning error minimization in least squares support vector machines. IEEE Transactions on Neural Networks, 2003, 14(3): 696-702.
[7] Gervini D., Yohai V. J. A class of robust and fully efficient regression estimators. Annals of Statistics, 2002, 30(2): 583-616.
[8] Hadi A. S., Imon A. H. M. R., Werner M. Detection of outliers. Wiley Interdisciplinary Reviews: Computational Statistics, 2009, 1(1): 57-70.
[9] Han J., Sun Z., Hao H. l0-norm based structural sparse least square regression for feature selection. Pattern Recognition, 2015, 48(12): 3927-3940.
[10] Hoegaerts L., 2004, Lecture Notes in Computer Science, Vol. 3316, P1247.