[2] Univ Electrocommun, Dept Informat & Commun Engn, Tokyo 1828585, Japan
Source:
2006 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORK PROCEEDINGS, VOLS 1-10
|
2006
Keywords:
DOI:
Not available
CLC classification:
TP18 [Theory of Artificial Intelligence];
Subject classification codes:
081104; 0812; 0835; 1405
Abstract:
The support vector machine (SVM) problem is a convex quadratic programming problem whose size scales with the number of training samples. If the training set is large, the problem cannot be solved by straightforward methods; large-scale SVM problems are instead tackled with chunking (decomposition) techniques. The quadratic programming problem involves a square matrix, called the kernel matrix, which is positive semi-definite; its rank can therefore be much smaller than its size. In this paper we discuss a method that exploits the low rank of the kernel matrix, so that an interior-point method (IPM) can be applied efficiently to the global (large-sized) problem. The method is based on second-order cone programming (SOCP): it reformulates the SVM's quadratic programming problem as a second-order cone programming problem. The SOCP method is much faster than the efficient solvers SVMlight and SVMTorch when the rank of the kernel matrix is small compared to the training set size, or when the kernel matrix can be approximated by a low-rank positive semi-definite matrix.
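The key observation in the abstract is that a positive semi-definite kernel matrix K of rank r admits a factorization K = G Gᵀ with G of size n×r, and it is this thin factor (not the full n×n matrix) that a cone-programming solver needs. The following is a minimal sketch of that factorization step only, not the authors' implementation; the function name `low_rank_factor`, the tolerance, and the synthetic rank-2 data are illustrative assumptions.

```python
import numpy as np

def low_rank_factor(K, tol=1e-8):
    """Factor a PSD kernel matrix as K ~= G @ G.T with G of shape (n, r),
    where r is the numerical rank.  In an SOCP reformulation the quadratic
    objective term (1/2) a.T Q a, with Q = D K D and D = diag(labels),
    can then be expressed through the cone constraint ||G.T @ D @ a|| <= t,
    so only the n-by-r factor G is ever handled by the solver."""
    w, V = np.linalg.eigh(K)                  # K is symmetric PSD
    keep = w > tol * max(w.max(), 1.0)        # drop numerically zero eigenvalues
    return V[:, keep] * np.sqrt(w[keep])      # scale eigenvectors by sqrt(eigvals)

rng = np.random.default_rng(0)
# 200 samples that lie in a 2-dimensional subspace of R^10: the linear
# kernel K = X @ X.T then has rank 2, far below the training size n = 200.
X = rng.standard_normal((200, 2)) @ rng.standard_normal((2, 10))
K = X @ X.T
G = low_rank_factor(K)
print(G.shape)                  # (200, 2): a rank-2 factor of a 200x200 matrix
print(np.allclose(K, G @ G.T))  # True: the factorization reproduces K
```

For a kernel matrix that is only approximately low-rank (as the abstract's last sentence allows), the same routine with a looser tolerance yields the low-rank PSD approximation obtained by truncating small eigenvalues.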