SMO-based pruning methods for sparse least squares support vector machines

Cited by: 76
Authors
Zeng, XY
Chen, XW [1 ]
Affiliations
[1] Calif State Univ Northridge, Dept Elect & Comp Engn, Northridge, CA 91003 USA
[2] Univ Kansas, Dept Elect Engn & Comp Sci, Lawrence, KS 66045 USA
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2005, Vol. 16, No. 6
Keywords
least squares support vector machine; pruning; sequential minimal optimization (SMO); sparseness;
DOI
10.1109/TNN.2005.852239
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Solutions of least squares support vector machines (LS-SVMs) are typically nonsparse. Sparseness is imposed by subsequently omitting the data that introduce the smallest training errors and retraining on the remaining data. This iterative retraining requires more intensive computation than training a single nonsparse LS-SVM. In this paper, we propose a new pruning algorithm for sparse LS-SVMs: the sequential minimal optimization (SMO) method is introduced into the pruning process; in addition, instead of selecting the pruning points by their training errors, we omit the data points that introduce the smallest changes to the dual objective function. This new criterion is computationally efficient. The effectiveness of the proposed method, in terms of both computational cost and classification accuracy, is demonstrated by numerical experiments.
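A minimal Python sketch of the pruning loop described in the abstract, under stated assumptions: an RBF kernel, the standard LS-SVM linear system solved directly in place of the paper's SMO training, and a simplified scoring rule that measures the drop in the dual objective when a single multiplier is zeroed while the others are held fixed (the paper's criterion additionally adjusts the remaining multipliers via SMO). All function names (rbf_kernel, train_ls_svm, prune_ls_svm) and parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma_k=0.5):
    # Gaussian (RBF) kernel matrix between two sets of points.
    d = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma_k * d)

def train_ls_svm(X, y, C=10.0, gamma_k=0.5):
    # Solve the LS-SVM classification system
    # [0  y^T; y  Omega + I/C] [b; alpha] = [0; 1]
    # directly (the paper trains with SMO instead).
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, gamma_k) + np.eye(n) / C
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, b

def dual_objective(alpha, y, Ktil):
    # J(alpha) = sum_i alpha_i - 1/2 sum_ij alpha_i alpha_j y_i y_j Ktil_ij
    return alpha.sum() - 0.5 * alpha @ (np.outer(y, y) * Ktil) @ alpha

def prune_ls_svm(X, y, keep_ratio=0.5, C=10.0, gamma_k=0.5):
    # Repeatedly drop the point whose removal changes the dual objective
    # the least, then retrain on the survivors.
    idx = np.arange(len(y))
    target = int(np.ceil(keep_ratio * len(y)))
    while len(idx) > target:
        Xs, ys = X[idx], y[idx]
        alpha, b = train_ls_svm(Xs, ys, C, gamma_k)
        Ktil = rbf_kernel(Xs, Xs, gamma_k) + np.eye(len(idx)) / C
        J = dual_objective(alpha, ys, Ktil)
        # Score each point by the change in the dual objective if its
        # multiplier were simply zeroed (a crude stand-in for the paper's
        # criterion, which updates the remaining multipliers via SMO).
        scores = np.empty(len(idx))
        for k in range(len(idx)):
            a = alpha.copy()
            a[k] = 0.0
            scores[k] = abs(J - dual_objective(a, ys, Ktil))
        idx = np.delete(idx, np.argmin(scores))   # drop the least influential point
    alpha, b = train_ls_svm(X[idx], y[idx], C, gamma_k)
    return idx, alpha, b

def predict(Xq, X_sv, y_sv, alpha, b, gamma_k=0.5):
    return np.sign(rbf_kernel(Xq, X_sv, gamma_k) @ (alpha * y_sv) + b)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(80, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
    idx, alpha, b = prune_ls_svm(X, y, keep_ratio=0.5)
    acc = (predict(X, X[idx], y[idx], alpha, b) == y).mean()
    print(f"kept {len(idx)} of {len(y)} points, training accuracy {acc:.2f}")
```

Each round here retrains from scratch for clarity; in the paper, SMO is used within the pruning process itself, which (presumably through warm starts from the previous multipliers) avoids the cost of repeatedly solving the full system.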
Pages: 1541-1546 (6 pages)