Nonlinear Regularization Path for Quadratic Loss Support Vector Machines

Cited by: 10
Authors
Karasuyama, Masayuki [1]
Takeuchi, Ichiro [1]
Affiliations
[1] Nagoya Inst Technol, Dept Engn, Nagoya, Aichi 4668555, Japan
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2011, Vol. 22, No. 10
Keywords
Parametric programming; rational approximation; support vector machines; regression; algorithm; selection
DOI
10.1109/TNN.2011.2164265
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Regularization path algorithms have been proposed to address the model selection problem in several machine learning approaches. These algorithms compute the entire path of solutions over all values of the regularization parameter by exploiting the fact that the solution paths are piecewise linear. In this paper, we extend the applicability of the regularization path algorithm to a class of learning machines with a quadratic loss and a quadratic penalty term. This class contains several important learning machines, such as the squared hinge loss support vector machine (SVM) and the modified Huber loss SVM. We first show that the solution paths of this class of learning machines are piecewise nonlinear, and that the path segments between two adjacent breakpoints are characterized by a class of rational functions. We then develop an algorithm that efficiently follows the piecewise nonlinear path by solving these rational equations. To solve them, we use a rational approximation technique with a quadratic convergence rate; our algorithm can therefore follow the nonlinear path much more precisely than existing approaches such as predictor-corrector-type nonlinear path approximation. We demonstrate the algorithm's performance on artificial and real data sets.
Pages: 1613-1625
Number of pages: 13