Effective pruning algorithm for least squares support vector machine classifier

Cited by: 2
Authors
Yang, Xiaowei [1 ,2 ,3 ]
Lu, Jie [2 ]
Zhang, Guangquan [2 ]
Affiliations
[1] School of Mathematical Sciences, South China University of Technology
[2] Faculty of Information Technology, Sydney University of Technology
[3] Key Laboratory of Symbol Computation and Knowledge Engineering, Jilin University
Source
Jisuanji Yanjiu yu Fazhan/Computer Research and Development | 2007 / Vol. 44 / No. 7
Keywords
Adaptive; Chunking incremental learning; Decremental learning; Least squares support vector machine; Pruning;
DOI
10.1360/crad20070706
Abstract
A well-known drawback of the least squares support vector machine (LS-SVM) is that sparseness is lost. In this study, an effective pruning algorithm is developed to address this problem. To avoid solving the primal set of linear equations, the proposed algorithm adopts a bottom-to-top strategy. During training, chunking incremental and decremental learning procedures are applied alternately, so that a small support vector set covering most of the information in the training set is formed adaptively. The final classifier is then constructed from this support vector set. To validate the proposed algorithm, it is applied to five benchmark UCI datasets, and different chunking sizes are tested to show the relationships among chunking size, number of support vectors, training time, and testing accuracy. The experimental results show that with a chunking size of 2 the proposed algorithm adaptively obtains sparse solutions with almost no loss of generalization performance, and its training speed is much faster than that of the sequential minimal optimization (SMO) algorithm. The proposed algorithm can also be applied to the least squares support vector regression machine as well as the LS-SVM classifier.
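As a rough illustration of the scheme the abstract describes, the sketch below trains a standard LS-SVM classifier by solving its dual linear system and grows, then prunes, a working set chunk by chunk. The pruning rule (drop points with the smallest |alpha|), the RBF kernel, and all parameter values here are illustrative assumptions, not the authors' exact bottom-to-top chunking procedure.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # Gaussian (RBF) kernel matrix between the rows of A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_fit(X, y, C=10.0, gamma=0.5):
    # Solve the standard LS-SVM linear system
    #   [ 0   1^T      ] [  b   ]   [ 0 ]
    #   [ 1   K + I/C  ] [alpha ] = [ y ]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, gamma) + np.eye(n) / C
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # bias b, coefficients alpha

def lssvm_predict(Xtr, b, alpha, Xte, gamma=0.5):
    return np.sign(rbf_kernel(Xte, Xtr, gamma) @ alpha + b)

def prune_lssvm(X, y, chunk=2, keep=20, C=10.0, gamma=0.5):
    # Naive incremental/decremental loop: add `chunk` training points,
    # retrain on the working set, and whenever it exceeds `keep` points,
    # discard those with the smallest |alpha| (least influential).
    idx = list(range(min(keep, len(y))))
    for start in range(len(idx), len(y), chunk):
        idx.extend(range(start, min(start + chunk, len(y))))
        b, alpha = lssvm_fit(X[idx], y[idx], C, gamma)
        if len(idx) > keep:
            order = np.argsort(np.abs(alpha))[::-1][:keep]
            idx = [idx[i] for i in sorted(order)]
    b, alpha = lssvm_fit(X[idx], y[idx], C, gamma)
    return np.array(idx), b, alpha
```

Note that this sketch re-solves the full working-set system at every step for clarity; the paper's point is precisely to avoid repeatedly solving the primal system, so a faithful implementation would update the solution incrementally instead.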
Pages: 1128-1136 (8 pages)