Ramp-loss nonparallel support vector regression: Robust, sparse and scalable approximation

Cited: 25
Authors
Tang, Long [1 ,2 ]
Tian, Yingjie [3 ]
Yang, Chunyan [1 ]
Pardalos, Panos M. [2 ]
Affiliations
[1] Guangdong Univ Technol, Res Inst Exten & Innovat Method, Guangzhou 510006, Guangdong, Peoples R China
[2] Univ Florida, Dept Ind & Syst Engn, Ctr Appl Optimizat, Gainesville, FL 32611 USA
[3] Chinese Acad Sci, Res Ctr Fictitious Econ & Data Sci, Beijing 100190, Peoples R China
Funding
Beijing Natural Science Foundation
Keywords
Support vector regression; Twin support vector regression; Ramp loss; CCCP; Sparseness; MACHINE; CLASSIFICATION; IMPROVEMENTS; ALGORITHM
DOI
10.1016/j.knosys.2018.02.016
CLC classification number
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Although twin support vector regression (TSVR) has been extensively studied and diverse variants have been successfully developed, when the training set contains outliers the regression model can be wrongly driven towards the outlier points, yielding extremely poor generalization performance. To overcome this shortcoming, a Ramp-loss nonparallel support vector regression (RL-NPSVR) is proposed in this work. By adopting a Ramp ε-insensitive loss function together with another Ramp-type linear loss function, RL-NPSVR not only explicitly filters noise and suppresses outliers but also achieves excellent sparseness. The resulting non-convexity of RL-NPSVR is handled by the concave-convex procedure (CCCP). Because a regularization term is added to each primal problem in strict accordance with the structural risk minimization (SRM) principle, CCCP in fact solves a series of reconstructed convex optimization problems whose dual formulation is the same as that of the standard SVR, so matrix inversion is avoided and a fast SMO-type algorithm can be used to accelerate training. Numerical experiments on various datasets verify the effectiveness of the proposed RL-NPSVR in terms of robustness to outliers, generalization ability, sparseness and scalability. (c) 2018 Elsevier B.V. All rights reserved.
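The interplay between the ramp loss and CCCP can be made concrete with a short sketch. The following is a minimal, hypothetical illustration rather than the paper's algorithm: RL-NPSVR fits two nonparallel bound functions and solves each convexified subproblem as an SVR-type dual with an SMO-type solver, whereas this sketch trains a single linear regressor and minimizes the convexified objective by plain subgradient descent. All names (eps_insensitive, ramp_eps_insensitive, cccp_fit) and parameter values are invented for the example.

```python
import numpy as np

def eps_insensitive(u, eps):
    """Convex eps-insensitive loss: max(0, |u| - eps)."""
    return np.maximum(0.0, np.abs(u) - eps)

def ramp_eps_insensitive(u, eps, s):
    """Ramp eps-insensitive loss as a difference of convex losses (s > eps):
    R(u) = L_eps(u) - L_s(u), which caps the penalty at s - eps so that
    points with huge residuals (outliers) stop influencing the fit."""
    return eps_insensitive(u, eps) - eps_insensitive(u, s)

def cccp_fit(X, y, eps=0.1, s=1.0, C=1.0, n_outer=10, n_inner=300, lr=2e-3):
    """CCCP outer loop: linearize the concave part -C*L_s at the current
    residuals, then approximately solve the convexified problem
    min 0.5*||w||^2 + C*sum_i [L_eps(r_i) + delta_i * r_i]
    by subgradient descent (a stand-in for the paper's SMO-type dual solver)."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(n_outer):
        r = X @ w + b - y
        # Frozen subgradient of the concave part w.r.t. each residual:
        # -sign(r_i) where |r_i| > s (the samples being filtered out), else 0.
        delta = np.where(np.abs(r) > s, -np.sign(r), 0.0)
        for _ in range(n_inner):
            r = X @ w + b - y
            # Subgradient of the convexified loss w.r.t. each residual.
            g = np.where(np.abs(r) > eps, np.sign(r), 0.0) + delta
            w -= lr * (w + C * (X.T @ g))  # the SRM term 0.5*||w||^2 adds w
            b -= lr * C * g.sum()
    return w, b

# Toy usage: a clean linear trend plus two gross outliers.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
y = 2.0 * X[:, 0] + 0.05 * rng.standard_normal(50)
y[:2] += 8.0                      # inject outliers
w, b = cccp_fit(X, y)
print(w, b)                       # slope stays near 2 despite the outliers
```

The sketch makes the robustness and sparseness mechanisms visible: once a residual exceeds the ramp level s, the frozen subgradient of the concave part cancels that sample's pull on w and b, so gross outliers are effectively filtered out, while samples lying inside the ε-tube contribute no subgradient at all.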
Pages: 55-67
Number of pages: 13
Related papers
50 records in total
  • [21] Training robust support vector machine with smooth Ramp loss in the primal space
    Wang, Lei
    Jia, Huading
    Li, Jie
    NEUROCOMPUTING, 2008, 71 (13-15): 3020-3025
  • [22] Robust ε-Support Vector Regression
    Lv, Yuan
    Gan, Zhong
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2014, 2014
  • [23] NPrSVM: Nonparallel sparse projection support vector machine with efficient algorithm
    Chen, Wei-Jie
    Shao, Yuan-Hai
    Li, Chun-Na
    Wang, Yu-Qing
    Liu, Ming-Zeng
    Wang, Zhen
    APPLIED SOFT COMPUTING, 2020, 90
  • [24] A method for sparse support vector regression
    Ertin, E
    Potter, LC
    INTELLIGENT COMPUTING: THEORY AND APPLICATIONS III, 2005, 5803: 24-30
  • [25] An improved robust and sparse twin support vector regression via linear programming
    Chen, Xiaobo
    Yang, Jian
    Chen, Long
    SOFT COMPUTING, 2014, 18 (12): 2335-2348
  • [27] L2-loss Large-scale Linear Nonparallel Support Vector Ordinal Regression
    Shi, Y.
    Li, P.-J.
    Wang, H.-D.
    ZIDONGHUA XUEBAO/ACTA AUTOMATICA SINICA, 2019, 45 (03): 505-517
  • [28] Robust twin support vector regression based on Huber loss function
    Balasundaram, S.
    Prasad, Subhash Chandra
    NEURAL COMPUTING AND APPLICATIONS, 2020, 32: 11285-11309
  • [29] Robust support vector regression with generic quadratic nonconvex ε-insensitive loss
    Ye, Yafen
    Gao, Junbin
    Shao, Yuanhai
    Li, Chunna
    Jin, Yan
    Hua, Xiangyu
    APPLIED MATHEMATICAL MODELLING, 2020, 82: 235-251
  • [30] Robust support vector quantile regression with truncated pinball loss (RSVQR)
    Hazarika, Barenya Bikash
    Gupta, Deepak
    Borah, Parashjyoti
    COMPUTATIONAL & APPLIED MATHEMATICS, 2023, 42 (06):