Second-order SMO improves SVM online and active learning

Cited by: 16
Authors:
Glasmachers, Tobias [1]
Igel, Christian [1]
Affiliations:
[1] Ruhr Univ Bochum, Inst Neuroinformat, D-44780 Bochum, Germany
DOI:
10.1162/neco.2007.10-06-354
CLC number:
TP18 [Artificial intelligence theory];
Subject classification codes:
081104; 0812; 0835; 1405
Abstract:
Iterative learning algorithms that approximate the solution of support vector machines (SVMs) have two potential advantages. First, they allow online and active learning. Second, for large data sets, computing the exact SVM solution may be too time-consuming, and an efficient approximation can be preferable. The powerful LASVM iteratively approaches the exact SVM solution using sequential minimal optimization (SMO). It allows efficient online and active learning. Here, this algorithm is considerably improved in speed and accuracy by replacing the working set selection in the SMO steps. A second-order working set selection strategy, which greedily aims at maximizing the progress in each single step, is incorporated.
Pages: 374-382
Number of pages: 9
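
The second-order selection the abstract refers to can be made concrete with a short sketch. The NumPy code below follows the common formulation of second-order working set selection (in the style of Fan et al., 2005): the first index is picked by the steepest feasible gradient, the second by maximizing the guaranteed gain of the two-variable SMO subproblem. This is a minimal sketch, not code from the paper; all names (select_working_pair, alpha, grad, y, K, C, tau) are illustrative assumptions, with grad taken to be the gradient of the SVM dual objective being maximized, y the labels in {-1, +1}, K the kernel matrix, and C the box constraint.

    import numpy as np

    def select_working_pair(alpha, grad, y, K, C, tau=1e-12):
        # Feasible index sets: coordinates that can move up / down
        # without leaving the box constraints 0 <= alpha <= C.
        up = np.where(((y == 1) & (alpha < C)) | ((y == -1) & (alpha > 0)))[0]
        down = np.where(((y == 1) & (alpha > 0)) | ((y == -1) & (alpha < C)))[0]

        # First index i: maximal KKT violation (plain first-order criterion).
        i = up[np.argmax(y[up] * grad[up])]
        gi = y[i] * grad[i]

        # Second index j: maximize b*b / a, proportional to the unconstrained
        # progress b^2 / (2a) of the two-variable subproblem, where b is the
        # gradient difference and a the curvature along the feasible direction.
        b = gi - y[down] * grad[down]
        a = K[i, i] + np.diag(K)[down] - 2.0 * y[i] * y[down] * K[i, down]
        a = np.maximum(a, tau)  # guard against zero or negative curvature
        # Candidates with b <= 0 violate nothing; if every gain is -inf,
        # the KKT conditions already hold up to tolerance.
        gain = np.where(b > 0, b * b / a, -np.inf)
        j = down[int(np.argmax(gain))]
        return i, j  # the working pair for the next SMO step

Compared with plain first-order (most-violating-pair) selection, the extra cost is essentially one kernel row K[i, :], which the subsequent SMO update needs anyway; the gain-based choice of the second index is what the abstract credits for the improvements in speed and accuracy.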