Robust regression using support vector regressions

Cited by: 64
Authors
Sabzekar, Mostafa [1 ]
Hasheminejad, Seyed Mohammad Hossein [2 ]
Affiliations
[1] Birjand Univ Technol, Dept Comp Engn, Birjand, Iran
[2] Alzahra Univ, Fac Engn, Dept Comp Engn, Tehran, Iran
Keywords
Support vector regression; Robustness; Outliers; Training noisy data; Algorithm
DOI
10.1016/j.chaos.2021.110738
CLC Number
O1 [Mathematics]
Discipline Codes
0701; 070101
Abstract
Noisy data and outliers have always been among the main challenges in regression applications. Their presence in the training data degrades the generalization ability of the resulting regression model. Filtering noisy data and outliers as a preprocessing step in pattern recognition is difficult in practice because noisy samples are hard to distinguish from the remaining ones. Hence, the robustness of an algorithm is defined as its capability to train the model in such a way that it suffers less from noisy data as well as outliers. In this study, we improve the performance of support vector regression (SVR) to make it more robust. SVR is sensitive to noisy data because its trained model is built from support vectors, which constitute a small portion of the training data. The inequalities in the constraints of epsilon-insensitive SVR are changed to fuzzy inequalities without any change to its loss function, giving the SVR constraints more flexibility to be satisfied. We then solve the resulting quadratic programming (QP) problem and compare the new method with the standard support vector regression model. Experimental results on different data sets show the superiority of the proposed method, especially in the presence of noisy data and outliers. (C) 2021 Elsevier Ltd. All rights reserved.
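To make the setting concrete, the following is a minimal illustrative sketch (not the paper's fuzzy-constraint method, which requires a QP solver): a linear epsilon-insensitive SVR trained by sub-gradient descent on the primal objective, fit on toy data containing gross outliers. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

def fit_linear_svr(X, y, C=1.0, eps=0.1, lr=0.01, n_iter=2000):
    """Sub-gradient descent on 0.5*||w||^2 + (C/n)*sum(max(0, |y - (Xw + b)| - eps)).

    Illustrative sketch only; the paper keeps this epsilon-insensitive loss
    but relaxes the QP constraints to fuzzy inequalities, which is not shown.
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(n_iter):
        resid = y - (X @ w + b)
        # Sub-gradient of the epsilon-insensitive loss is zero inside the tube;
        # points outside contribute only their sign (bounded influence).
        active = np.abs(resid) > eps
        sign = np.sign(resid)
        grad_w = w - C * (X[active].T @ sign[active]) / n
        grad_b = -C * sign[active].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy data: y = 2x + 1 with small noise, plus a few gross outliers.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(60, 1))
y = 2.0 * X[:, 0] + 1.0 + 0.05 * rng.normal(size=60)
y[:3] += 8.0  # inject outliers

w, b = fit_linear_svr(X, y, C=10.0, eps=0.1)
print(w, b)  # slope near 2, intercept near 1 despite the outliers
```

Because outliers outside the epsilon tube contribute only a bounded (sign) term to the gradient, the fitted line is pulled far less toward them than a least-squares fit would be, which is the intuition behind SVR's partial robustness that the paper's fuzzy-constraint formulation further improves.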
Pages: 6
References (22 in total)
[1] Balasundaram S., Prasad S.C. Robust twin support vector regression based on Huber loss function. Neural Computing & Applications, 2020, 32(15): 11285-11309.
[2] Chen C., Yan C., Zhao N., Guo B., Liu G. A robust algorithm of support vector regression with a trimmed Huber loss function in the primal. Soft Computing, 2017, 21(18): 5235-5243.
[3] Chen C., Li Y., Yan C., Guo J., Liu G. Least absolute deviation-based robust support vector regression. Knowledge-Based Systems, 2017, 131: 183-194.
[4] Chen X., Yang J., Liang J., Ye Q. Recursive robust least squares support vector regression based on maximum correntropy criterion. Neurocomputing, 2012, 97: 63-73.
[5] Cherkassky V., Ma Y. Practical selection of SVM parameters and noise estimation for SVM regression. Neural Networks, 2004, 17(1): 113-126.
[6] Cui W., Yan X. Adaptive weighted least square support vector machine regression integrated with outlier detection and its application in QSAR. Chemometrics and Intelligent Laboratory Systems, 2009, 98(2): 130-135.
[7] De Vito S., Massera E., Piga A., Martinotto L., Di Francia G. On field calibration of an electronic nose for benzene estimation in an urban pollution monitoring scenario. Sensors and Actuators B: Chemical, 2008, 129(2): 750-757.
[8] Ding X., Hasanipanah M., Rad H.N., Zhou W. Predicting the blast-induced vibration velocity using a bagged support vector regression optimized with firefly algorithm. Engineering with Computers, 2021, 37(3): 2273-2284.
[9] Dunn J.C. Journal of Cybernetics, 1974, 4: 1. DOI: 10.1080/01969727408546062.
[10] Herrera F., 2015, DEALING NOISY DATA D, V72.