Quadratic programming formulations for classification and regression

Cited: 8
Authors
Gilbert, Robin C. [1]
Trafalis, Theodore B. [1]
Affiliation
[1] Univ Oklahoma, Sch Ind Engn, Lab Optimizat & Intelligent Syst, Norman, OK 73019 USA
Keywords
classification; nonlinear regression; pattern recognition; SMO algorithm; convergence
DOI
10.1080/10556780902752892
Chinese Library Classification
TP31 [Computer software]
Discipline codes
081202; 0835
Abstract
We reformulate the support vector machine approach to classification and regression problems using a different methodology than the classical 'largest margin' paradigm. From this, we are able to derive extremely simple quadratic programming problems that allow for general symbolic solutions to the classical problems of geometric classification and regression. We obtain a new class of learning machines that are also robust to the presence of small perturbations and/or corrupted or missing data in the training sets (provided that information about the amplitude of the perturbations is known approximately). A high performance framework for very large-scale classification and regression problems based on a Voronoi tessellation of the input space is also introduced in this work. Our approach has been tested on seven benchmark databases with noticeable gain in computational time in comparison with standard decomposition techniques such as SVMlight.
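The abstract's large-scale strategy, partitioning the input space with a Voronoi tessellation and solving a smaller learning problem in each cell, can be sketched as follows. This is a hedged illustration rather than the paper's actual formulation: the choice of Voronoi sites, the per-cell linear SVM trained by hinge-loss subgradient descent, and all names and parameters (`train_local_svms`, `epochs`, `lr`, `lam`) are assumptions made for the sake of the example.

```python
import numpy as np

def train_local_svms(X, y, sites, epochs=200, lr=0.1, lam=0.01):
    """Train one linear SVM (hinge loss + L2, subgradient descent) per Voronoi cell.

    X: (n, d) inputs; y: (n,) labels in {-1, +1};
    sites: (k, d) points whose Voronoi cells partition the input space.
    """
    # Assign each training point to its nearest site, i.e. its Voronoi cell.
    cell = np.argmin(((X[:, None, :] - sites[None, :, :]) ** 2).sum(-1), axis=1)
    models = []
    for j in range(len(sites)):
        Xj, yj = X[cell == j], y[cell == j]
        w, b = np.zeros(X.shape[1]), 0.0
        for _ in range(epochs):
            margins = yj * (Xj @ w + b)
            viol = margins < 1  # points violating the unit margin
            if viol.any():
                # Subgradient of lam/2*||w||^2 + mean hinge loss.
                gw = lam * w - (yj[viol, None] * Xj[viol]).mean(0)
                gb = -yj[viol].mean()
            else:
                gw, gb = lam * w, 0.0
            w, b = w - lr * gw, b - lr * gb
        models.append((w, b))
    return models

def predict(X, sites, models):
    # Route each query to its Voronoi cell and apply that cell's local model.
    cell = np.argmin(((X[:, None, :] - sites[None, :, :]) ** 2).sum(-1), axis=1)
    return np.array([np.sign(X[i] @ models[c][0] + models[c][1])
                     for i, c in enumerate(cell)])
```

Since each cell's quadratic program only sees a fraction of the training set, the per-problem cost shrinks sharply, which is the source of the speedup the abstract reports against decomposition methods such as SVMlight.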
Pages: 175-185
Page count: 11