Rigorous proof of termination of SMO algorithm for support vector machines

Cited by: 33
Authors
Takahashi, N [1 ]
Nishi, T [1 ]
Affiliation
[1] Kyushu Univ, Dept Comp Sci & Commun Engn, Fukuoka 8128581, Japan
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2005, Vol. 16, No. 3
Keywords
support vector machines (SVMs); sequential minimal optimization (SMO) algorithm; convergence; termination;
DOI
10.1109/TNN.2005.844857
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The sequential minimal optimization (SMO) algorithm is one of the simplest decomposition methods for training support vector machines (SVMs). Keerthi and Gilbert recently studied the convergence properties of the SMO algorithm and gave a proof that it always stops within a finite number of iterations. In this letter, we point out the incompleteness of their proof and give a more rigorous one.
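For readers unfamiliar with SMO, the following is a minimal sketch of the *simplified* SMO variant for a linear-kernel SVM dual, in which the second Lagrange multiplier is chosen at random rather than by the working-set selection rule whose termination this paper analyzes. It is an illustration of the pairwise-update idea only, not the authors' algorithm; the function name and all parameters are this sketch's own.

```python
import numpy as np

def simplified_smo(X, y, C=1.0, tol=1e-4, max_passes=20, seed=0):
    """Simplified SMO sketch: optimize pairs (alpha_i, alpha_j) of the
    SVM dual analytically, keeping the constraint sum_i alpha_i*y_i == 0."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    K = X @ X.T                      # linear kernel matrix
    alpha = np.zeros(n)
    b = 0.0
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            Ei = (alpha * y) @ K[:, i] + b - y[i]   # prediction error on x_i
            # proceed only if (i) violates the KKT conditions within tol
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = rng.integers(n - 1)
                j = j if j < i else j + 1           # random j != i
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                # box bounds for alpha_j so both multipliers stay in [0, C]
                if y[i] != y[j]:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L == H:
                    continue
                eta = 2 * K[i, j] - K[i, i] - K[j, j]   # second derivative along the pair
                if eta >= 0:
                    continue
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-5:
                    continue
                # move alpha_i so the equality constraint is preserved exactly
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # update the threshold b from whichever multiplier is unbounded
                b1 = b - Ei - y[i] * (alpha[i] - ai_old) * K[i, i] \
                     - y[j] * (alpha[j] - aj_old) * K[i, j]
                b2 = b - Ej - y[i] * (alpha[i] - ai_old) * K[i, j] \
                     - y[j] * (alpha[j] - aj_old) * K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    w = (alpha * y) @ X              # primal weight vector (linear kernel only)
    return w, b, alpha
```

On a toy separable set such as X = [[2,2],[3,3],[-2,-2],[-3,-3]] with labels [+1,+1,-1,-1], the returned (w, b) separates the two classes; the termination question studied in the paper is precisely whether such a loop of pairwise updates must stop in finitely many iterations.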
Pages: 774-776
Page count: 3
References (9 items)
[1] [Anonymous], 1990, SUPPORT VECTOR LEARN.
[2] Bazaraa, M. S., 2013, NONLINEAR PROGRAMMIN.
[3] Joachims, T., 1998, ADV KERNEL METHODS S.
[4] Keerthi, S. S.; Gilbert, E. G., "Convergence of a generalized SMO algorithm for SVM classifier design," Machine Learning, 2002, 46(1-3): 351-360.
[5] Keerthi, S. S.; Shevade, S. K.; Bhattacharyya, C.; Murthy, K. R. K., "Improvements to Platt's SMO algorithm for SVM classifier design," Neural Computation, 2001, 13(3): 637-649.
[6] Lin, C. J., "A formal analysis of stopping criteria of decomposition methods for support vector machines," IEEE Transactions on Neural Networks, 2002, 13(5): 1045-1052.
[7] Lin, C. J., "Asymptotic convergence of an SMO algorithm without any assumptions," IEEE Transactions on Neural Networks, 2002, 13(1): 248-250.
[8] Lin, C. J., "On the convergence of the decomposition method for support vector machines," IEEE Transactions on Neural Networks, 2001, 12(6): 1288-1298.
[9] Vapnik, V., 1998, STAT LEARNING THEORY, V1, P2.