A Dual-Based Pruning Method for the Least-Squares Support Vector Machine

Cited by: 0
Authors
Xia, Xiao-Lei [1 ]
Zhou, Shang-Ming [2 ]
Ouyang, Mingxing [1 ]
Xiang, Dafang [1 ]
Zhang, Zhijun [1 ]
Zhou, Zexiang [1 ]
Affiliations
[1] Guangdong Songshan Polytech Coll, Sch Elect Engn, Shaoguan, Peoples R China
[2] Univ Plymouth, Ctr Hlth Technol, Plymouth PL4 8AA, Devon, England
Funding
UK Medical Research Council;
Keywords
least-squares support vector machine; sparsity; pruning methods; dual form; method of Lagrange multipliers; ERROR MINIMIZATION; SPARSE LSSVM;
DOI
Not available
CLC number
TP [automation technology, computer technology];
Discipline code
0812;
Abstract
The least-squares support vector machine (LS-SVM) is generally parameterized by a large number of support vectors, which slows down classification. This paper proposes to search for and prune two types of support vectors. The first type comprises potential outliers, each of which is misclassified by the model trained on the remaining samples. The second type comprises samples whose removal causes the least perturbation to the dual objective function. Without explicitly re-running the training procedure, the LS-SVM model corresponding to the omission of a training sample is derived analytically from the LS-SVM trained on the whole training set. This derivation reduces the computational cost of pruning a sample and constitutes the main technical contribution of this paper. Experimental results on six UCI datasets show that, compared with classical pruning methods, the proposed algorithm significantly enhances the sparsity of the LS-SVM while maintaining satisfactory generalization performance.
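The mechanics the abstract alludes to can be sketched in code. This is a minimal illustration, not the authors' implementation: an LS-SVM is trained in its dual form by solving the KKT linear system, and each sample's leave-one-out residual is then read off the inverse of that system via the known closed form e_i = alpha_i / (H^{-1})_{ii}, so that "potential outliers" (samples misclassified when they are omitted from training) can be flagged without any retraining. The RBF kernel, all parameter values, and the synthetic data are assumptions chosen for the demo.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=0.5):
    # squared Euclidean distances between rows of X and rows of Z
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Z**2, axis=1)[None, :]
          - 2.0 * X @ Z.T)
    return np.exp(-gamma * d2)

def train_lssvm(K, y, C=10.0):
    """Solve the LS-SVM dual KKT system
         [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y]
    and return (b, alpha, H_inv); the inverse is kept so that
    leave-one-out quantities come for free afterwards."""
    n = len(y)
    H = np.zeros((n + 1, n + 1))
    H[0, 1:] = 1.0
    H[1:, 0] = 1.0
    H[1:, 1:] = K + np.eye(n) / C
    H_inv = np.linalg.inv(H)
    sol = H_inv @ np.concatenate(([0.0], y))
    return sol[0], sol[1:], H_inv

def loo_outlier_flags(y, alpha, H_inv):
    """Closed-form leave-one-out residuals e_i = alpha_i / (H^{-1})_{ii}
    (the diagonal entries corresponding to the alpha block). Sample i is
    a potential outlier if the model trained without it misclassifies it,
    i.e. y_i * f^{(-i)}(x_i) = y_i * (y_i - e_i) < 0."""
    e_loo = alpha / np.diag(H_inv)[1:]
    return y * (y - e_loo) < 0.0, e_loo

# tiny demo on synthetic two-class data
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.7, (15, 2)),
               rng.normal(+1.0, 0.7, (15, 2))])
y = np.concatenate([-np.ones(15), np.ones(15)])
K = rbf_kernel(X, X)
b, alpha, H_inv = train_lssvm(K, y)
outliers, e_loo = loo_outlier_flags(y, alpha, H_inv)
```

Pruning a flagged sample and re-deriving the reduced model by block updates of `H_inv`, rather than refactorizing the whole system, is where an approach of this kind saves computation.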
Pages: 10377-10383
Page count: 7