Ramp-loss nonparallel support vector regression: Robust, sparse and scalable approximation

Cited: 25
Authors
Tang, Long [1 ,2 ]
Tian, Yingjie [3 ]
Yang, Chunyan [1 ]
Pardalos, Panos M. [2 ]
Affiliations
[1] Guangdong Univ Technol, Res Inst Exten & Innovat Method, Guangzhou 510006, Guangdong, Peoples R China
[2] Univ Florida, Dept Ind & Syst Engn, Ctr Appl Optimizat, Gainesville, FL 32611 USA
[3] Chinese Acad Sci, Res Ctr Fictitious Econ & Data Sci, Beijing 100190, Peoples R China
Funding
Beijing Natural Science Foundation;
Keywords
Support vector regression; Twin support vector regression; Ramp loss; CCCP; Sparseness; MACHINE; CLASSIFICATION; IMPROVEMENTS; ALGORITHM;
DOI
10.1016/j.knosys.2018.02.016
CLC classification number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Although twin support vector regression (TSVR) has been extensively studied and diverse variants have been successfully developed, when the training set contains outliers the regression model can be wrongly driven towards the outlier points, yielding extremely poor generalization performance. To overcome this shortcoming, a ramp-loss nonparallel support vector regression (RL-NPSVR) is proposed in this work. By adopting a ramp epsilon-insensitive loss function and another ramp-type linear loss function, RL-NPSVR not only explicitly filters noise and suppresses outliers but also achieves excellent sparseness. The non-convexity of RL-NPSVR is handled by the concave-convex procedure (CCCP). Because a regularized term is added to each primal problem, rigidly following the structural risk minimization (SRM) principle, CCCP actually solves a series of reconstructed convex optimization problems whose dual formulation is the same as that of standard SVR, so that computing an inverse matrix is avoided and an SMO-type fast algorithm can be used to accelerate training. Numerical experiments on various datasets have verified the effectiveness of the proposed RL-NPSVR in terms of outlier sensitivity, generalization ability, sparseness and scalability. (c) 2018 Elsevier B.V. All rights reserved.
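The ramp epsilon-insensitive loss mentioned in the abstract can be written as the difference of two convex epsilon-insensitive losses, which is exactly what makes CCCP applicable. A minimal sketch (the symbol names `eps` for the insensitivity radius and `s` for the ramp cut-off follow common ramp-loss notation and are assumptions, not taken from the paper itself):

```python
def eps_insensitive(u, eps):
    """Standard epsilon-insensitive loss: zero inside the eps-tube."""
    return max(0.0, abs(u) - eps)

def ramp_eps_insensitive(u, eps, s):
    """Ramp epsilon-insensitive loss: like eps_insensitive, but capped
    at s - eps so large residuals (outliers) stop contributing."""
    assert s > eps, "the ramp cut-off must lie outside the eps-tube"
    return min(eps_insensitive(u, eps), s - eps)

def dc_decomposition(u, eps, s):
    """The same ramp loss as a difference of two convex losses --
    the split CCCP operates on (convex part minus concave part)."""
    return eps_insensitive(u, eps) - eps_insensitive(u, s)
```

For any residual `u`, `ramp_eps_insensitive(u, eps, s)` equals `dc_decomposition(u, eps, s)`; CCCP linearizes the concave term `-eps_insensitive(u, s)` at the current iterate and solves the resulting convex, SVR-like subproblem at each step.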
Pages: 55-67 (13 pages)
Related papers
50 records total
  • [1] Ramp loss nonparallel support vector machine for pattern classification
    Liu, Dalian
    Shi, Yong
    Tian, Yingjie
    KNOWLEDGE-BASED SYSTEMS, 2015, 85 : 224 - 233
  • [2] Ramp Loss Linear Programming Nonparallel Support Vector Machine
    Liu, Dalian
    Chen, Dandan
    Shi, Yong
    Tian, Yingjie
    INTERNATIONAL CONFERENCE ON COMPUTATIONAL SCIENCE 2016 (ICCS 2016), 2016, 80 : 1745 - 1754
  • [3] Nonparallel Support Vector Ordinal Regression
    Wang, Huadong
    Shi, Yong
    Niu, Lingfeng
    Tian, Yingjie
    IEEE TRANSACTIONS ON CYBERNETICS, 2017, 47 (10) : 3306 - 3317
  • [4] Smooth pinball loss nonparallel support vector machine for robust classification
    Liu, Ming-Zeng
    Shao, Yuan-Hai
    Li, Chun-Na
    Chen, Wei-Jie
    APPLIED SOFT COMPUTING, 2021, 98
  • [5] Ramp loss K-Support Vector Classification-Regression; a robust and sparse multi-class approach to the intrusion detection problem
    Bamakan, Seyed Mojtaba Hosseini
    Wang, Huadong
    Shi, Yong
    KNOWLEDGE-BASED SYSTEMS, 2017, 126 : 113 - 126
  • [6] Primal twin support vector regression and its sparse approximation
    Peng, Xinjun
    NEUROCOMPUTING, 2010, 73 (16-18) : 2846 - 2858
  • [7] Experiments in value function approximation with sparse support vector regression
    Jung, T
    Uthmann, T
    MACHINE LEARNING: ECML 2004, PROCEEDINGS, 2004, 3201 : 180 - 191
  • [8] Epsilon-nonparallel support vector regression
    Carrasco, Miguel
    Lopez, Julio
    Maldonado, Sebastian
    APPLIED INTELLIGENCE, 2019, 49 (12) : 4223 - 4236
  • [10] Improved robust nonparallel support vector machines
    Ali Sahleh
    Maziar Salahi
    International Journal of Data Science and Analytics, 2024, 17 : 61 - 74