Parallel randomized sampling for support vector machine (SVM) and support vector regression (SVR)

Cited by: 19
Authors
Lu, Yumao [1 ]
Roychowdhury, Vwani [2 ]
Affiliations
[1] Yahoo Inc, Sunnyvale, CA 94089 USA
[2] Univ Calif Los Angeles, Dept Elect Engn, Los Angeles, CA 90024 USA
Keywords
randomized sampling; support vector machine; support vector regression; parallel algorithm;
DOI
10.1007/s10115-007-0082-6
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405
Abstract
This paper proposes a parallel randomized support vector machine (PRSVM) and a parallel randomized support vector regression (PRSVR) algorithm based on a randomized sampling technique. The proposed PRSVM and PRSVR have four major advantages over previous methods. (1) We prove that the proposed algorithms achieve an average convergence rate that is, to the best of our knowledge, the fastest bounded convergence rate among all SVM decomposition training algorithms. This fast average convergence bound is achieved by a unique priority-based sampling mechanism. (2) Unlike previous work (Provably fast training algorithm for support vector machines, 2001), the proposed algorithms work for general linearly non-separable SVM and general non-linear SVR problems. This improvement is achieved by modeling new LP-type problems based on the Karush-Kuhn-Tucker optimality conditions. (3) The proposed algorithms are the first parallel versions of randomized sampling algorithms for SVM and SVR. Both the analytical convergence bound and the numerical results from a real application show that the proposed algorithms have good scalability. (4) We demonstrate the algorithms on both synthetic data and data obtained from a real-world application. Performance comparisons with SVMlight show that the proposed algorithms can be implemented efficiently.
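The record above does not include the algorithm itself, but the loop the abstract describes (draw a working set by priority, solve the SVM subproblem on it, check margin/KKT violations over the remaining points, and raise the priority of violators) follows the classic Clarkson-style randomized sampling scheme for LP-type problems. The sketch below illustrates that sampling loop for a linear SVM in plain NumPy; the names `svm_dual_solve` and `prsvm_sketch`, the projected-gradient subproblem solver, and the omission of the bias term are all simplifying assumptions for illustration, not the authors' implementation (which additionally parallelizes the violation check).

```python
import numpy as np

def svm_dual_solve(K, y, C=1.0, iters=1000, lr=0.005):
    """Approximately solve the SVM dual on a small sample via
    projected gradient ascent (bias term omitted for brevity)."""
    Q = (y[:, None] * y[None, :]) * K
    alpha = np.zeros(len(y))
    for _ in range(iters):
        alpha = np.clip(alpha + lr * (1.0 - Q @ alpha), 0.0, C)
    return alpha

def prsvm_sketch(X, y, C=1.0, sample_size=20, rounds=30, seed=0):
    """Priority-based randomized sampling loop (Clarkson-style):
    solve on a weighted sample, then double the sampling weight of
    every point that violates the margin under the sample's model."""
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.ones(n)                       # sampling priorities
    K = X @ X.T                          # linear kernel matrix
    f = np.zeros(n)
    for _ in range(rounds):
        idx = rng.choice(n, size=min(sample_size, n),
                         replace=False, p=w / w.sum())
        alpha = svm_dual_solve(K[np.ix_(idx, idx)], y[idx], C)
        f = (alpha * y[idx]) @ K[idx]    # decision values for all points
        violators = y * f < 1.0 - 1e-6   # margin/KKT violations
        violators[idx] = False           # sampled points were solved directly
        if not violators.any():
            break                        # no violators left: done
        w[violators] *= 2.0              # raise violators' priority
    return f

# Toy demo: two well-separated Gaussian clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(2.0, 0.3, (50, 2)),
               rng.normal(-2.0, 0.3, (50, 2))])
y = np.hstack([np.ones(50), -np.ones(50)])
f = prsvm_sketch(X, y)
acc = (np.sign(f) == y).mean()
```

The weight-doubling step is what yields the fast expected convergence: each round either terminates or at least doubles the total priority mass of some constraint-defining point, so the true support vectors are sampled with rapidly growing probability.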
Pages: 233-247 (15 pages)
Related papers
50 records in total
  • [31] Multiple Submodels Parallel Support Vector Machine on Spark
    Liu, Chang
    Wu, Bin
    Yang, Yi
    Guo, Zhihong
    2016 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2016, : 945 - 950
  • [32] Successive overrelaxation for support vector regression
    Quan, Y
    Yang, J
    Ye, CZ
    ROUGH SETS, FUZZY SETS, DATA MINING, AND GRANULAR COMPUTING, 2003, 2639 : 648 - 651
  • [33] PTSVRs: Regression models via projection twin support vector machine
    Peng, Xinjun
    Chen, De
    INFORMATION SCIENCES, 2018, 435 : 1 - 14
  • [34] V-SVR+: Support Vector Regression With Variational Privileged Information
    Shu, Yangyang
    Li, Qian
    Xu, Chang
    Liu, Shaowu
    Xu, Guandong
    IEEE TRANSACTIONS ON MULTIMEDIA, 2022, 24 : 876 - 889
  • [35] Relaxed support vector regression
    Panagopoulos, Orestis P.
    Xanthopoulos, Petros
    Razzaghi, Talayeh
    Seref, Onur
    ANNALS OF OPERATIONS RESEARCH, 2019, 276 (1-2) : 191 - 210
  • [37] On Lagrangian support vector regression
    Balasundaram, S.
    Kapil
    EXPERT SYSTEMS WITH APPLICATIONS, 2010, 37 (12) : 8784 - 8792
  • [38] Field Support Vector Regression
    Jiang, Haochuan
    Huang, Kaizhu
    Zhang, Rui
    NEURAL INFORMATION PROCESSING, ICONIP 2017, PT I, 2017, 10634 : 699 - 708
  • [39] Convex support vector regression
    Liao, Zhiqiang
    Dai, Sheng
    Kuosmanen, Timo
    EUROPEAN JOURNAL OF OPERATIONAL RESEARCH, 2024, 313 (03) : 858 - 870
  • [40] Least squares support vector machine regression with additional constrains
    Ye Hong
    Sun, Bing-Yu
    Wang, Ru Jing
    PROCEEDINGS OF 2006 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE: 50 YEARS' ACHIEVEMENTS, FUTURE DIRECTIONS AND SOCIAL IMPACTS, 2006, : 682 - 684