An adaptive twin support vector regression machine based on rough and fuzzy set theories

Cited by: 0
Authors
Zhenxia Xue
Roxin Zhang
Chuandong Qin
Xiaoqing Zeng
Affiliations
[1] North Minzu University, School of Mathematics and Information Science
[2] Northern Michigan University, Department of Mathematics and Computer Science
[3] Changsha University of Science and Technology, School of Economics and Management
Source
Neural Computing and Applications | 2020 / Volume 32
Keywords
Support vector machine; $\nu$-Twin support vector regression; Rough theory; Fuzzy theory
DOI
Not available
Abstract
It is known that the existing $\nu$-twin support vector regression ($\nu$-TWSVR) can optimize $\varepsilon_1$ and $\varepsilon_2$ automatically through proper selection of the parameters $\nu_1$ and $\nu_2$. However, since only the points near the lower-bound and upper-bound regressors are considered, it often suffers from overfitting. Furthermore, equal penalties are applied to all samples, even though different samples normally have different effects on the regression function. In this paper, we propose an adaptive twin support vector regression (ATWSVR) machine that incorporates fuzzy and rough set theories to reduce the negative impact of possible outliers on $\nu$-TWSVR. First, two optimization models are constructed to obtain the lower-bound and upper-bound regressors, using tools from rough and fuzzy set theories. Theorems 1 and 2 are then derived, through the application of the KKT conditions and duality theory, to establish the connections between the dual optimal values and the regions in which the data points are located. Next, the definitions of the different types of support vectors and their fuzzy proportions are given, and Theorems 3 and 4 are proved to provide bounds on the fuzzy proportions of these support vectors. Finally, the training points located in different regions are assigned different fuzzy membership values by iterative methods. Moreover, the approach achieves structural risk minimization and automatically controls the fuzzy proportions of support vectors. The proposed ATWSVR is more robust on data sets with outliers, as evidenced by experimental results on both simulated examples and benchmark real-world data sets; these results also confirm the claims made in the theorems above.
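
To illustrate the sample-wise weighting idea described in the abstract, a schematic form of a fuzzy-weighted lower-bound problem is sketched below. This is only an orientation sketch based on the standard $\nu$-TWSVR formulation in the literature; the membership weights $s_i$, the penalty parameters $C_1$, $\nu_1$, and the exact regularization terms are illustrative assumptions and need not match the models derived in the paper.

$$
\begin{aligned}
\min_{w_1,\,b_1,\,\varepsilon_1,\,\xi}\quad & \tfrac{1}{2}\,\lVert Y-(Aw_1+e b_1)\rVert^{2} + C_1\nu_1\varepsilon_1 + \frac{C_1}{m}\sum_{i=1}^{m} s_i\,\xi_i \\
\text{s.t.}\quad & y_i-\bigl(w_1^{\top}x_i+b_1\bigr) \ge -\varepsilon_1-\xi_i,\qquad \xi_i\ge 0,\quad \varepsilon_1\ge 0,
\end{aligned}
$$

where $A$ stacks the training inputs $x_i$ with targets $Y=(y_1,\dots ,y_m)^{\top}$, $e$ is a vector of ones, and $s_i\in(0,1]$ is the fuzzy membership of sample $x_i$. Points identified as likely outliers receive smaller $s_i$ and therefore contribute less to the slack penalty, while the $\nu_1$ term still allows $\varepsilon_1$ to be optimized automatically; the upper-bound problem is symmetric.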
Pages: 4709-4732
Number of pages: 23