Pairing support vector algorithm for data regression

Cited by: 8
Authors
Hao, Pei-Yi [1 ]
Affiliation
[1] Natl Kaohsiung Univ Appl Sci, Dept Informat Management, 415 Chien Kung Rd, Kaohsiung 807, Taiwan
Keywords
Support vector regression; Twin support vector regression; Noise heteroscedastic model; Parametric insensitive model; Nonparallel functions; Sparse regression
DOI
10.1016/j.neucom.2016.11.024
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
This study introduces a novel and efficient pairing support vector algorithm for data regression, called PSVR. The PSVR approach estimates an insensitive zone of flexible shape that tightly fits the training samples. Motivated by the twin support vector regression machine (TSVR), the up- and down-bound functions are estimated by solving two smaller quadratic programming problems (QPPs) rather than a single larger QPP, so training is faster than for the classical support vector regression machine (ε-SVR). The benefits of PSVR over TSVR are enhanced sparsity and generalization capability. By introducing the concept of the insensitive zone, the number of basis functions required to determine the regression model is significantly reduced, yielding better sparsity and faster prediction than TSVR. In addition, the approach embodies the essence of statistical learning theory by adopting the principle of structural risk minimization, resulting in better generalization capability than TSVR. Moreover, the up- and down-bound functions of the insensitive zone reflect the characteristics of the data distribution, enabling simultaneous automatic estimation of the conditional mean and the predictive variance. The experimental results verify that the proposed algorithm not only learns faster but also generalizes better and predicts faster. On average, PSVR yields satisfactory regression performance with an RMSE of 0.2549 on benchmark data sets, surpassing existing methods. The training speed of PSVR is over four times that of standard ε-SVR, and PSVR improves sparsity by 59.95% over TSVR.
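The up- and down-bound idea described in the abstract can be sketched in a simplified form. The snippet below is a minimal illustration, not the paper's PSVR algorithm: in place of the two TSVR-style QPPs, it approximates the two bound functions with kernel ridge regression on targets shifted by ±ε, then takes their midpoint as the conditional-mean estimate and their gap as the local insensitive-zone width. The RBF kernel and the values of γ, λ, and ε are all assumed for illustration.

```python
import numpy as np

# Toy 1-D data: noisy sine curve (assumed example, not from the paper).
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 60)[:, None]
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(60)

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between row-sample sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_ridge_fit(X, t, lam=1e-2):
    """Closed-form kernel ridge coefficients: (K + lam*I)^-1 t."""
    K = rbf_kernel(X, X)
    return np.linalg.solve(K + lam * np.eye(len(X)), t)

eps = 0.1
# Stand-ins for the two smaller problems: fit the up-bound on y + eps
# and the down-bound on y - eps (a simplification of the paired QPPs).
alpha_up = kernel_ridge_fit(X, y + eps)
alpha_dn = kernel_ridge_fit(X, y - eps)

K = rbf_kernel(X, X)
f_up, f_dn = K @ alpha_up, K @ alpha_dn
f_mean = 0.5 * (f_up + f_dn)   # conditional-mean estimate (midpoint of zone)
width = f_up - f_dn            # local insensitive-zone width (spread proxy)
```

Because the two bounds are fitted independently, the gap `width` can vary with the input, which mirrors how a flexible insensitive zone adapts to heteroscedastic noise; PSVR itself obtains the bounds from constrained QPPs rather than ridge regression.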
Pages: 174-187 (14 pages)