A new fuzzy support vector machine with pinball loss

Cited by: 0
Authors
Verma R.N. [1 ]
Deo R. [1 ]
Srivastava R. [1 ,2 ]
Subbarao N. [1 ]
Singh G.P. [1 ]
Affiliations
[1] School of Computational and Integrative Sciences, Jawaharlal Nehru University, New Delhi
[2] Center for Computational Natural Sciences and Bioinformatics, International Institute of Information Technology, Hyderabad
Source
Discover Artificial Intelligence | 2023 / Vol. 3 / Issue 01
Keywords
Fuzzy support vector machine; Hinge loss; Pinball loss; Support vector machine;
DOI
10.1007/s44163-023-00057-5
Abstract
The fuzzy support vector machine (FSVM) assigns each sample a fuzzy membership value based on its relevance, making it less sensitive to noise and outliers in the data. Although FSVM has had some success in mitigating the negative effects of noise, it uses the hinge loss, which maximizes the shortest distance between the two classes and is ineffective against feature noise near the decision boundary. Furthermore, while FSVM focuses on misclassification errors, it neglects the critical minimization of within-class scatter. To improve the performance of FSVM, we present a fuzzy support vector machine with pinball loss (FPin-SVM), a fuzzy extension of a reformulation of the recently proposed support vector machine with pinball loss (Pin-SVM), with several significant improvements. First, because we use the squared L2-norm of the error variables instead of the L1-norm, our FPin-SVM is a strongly convex minimization problem. Second, to speed up training, solutions of the proposed FPin-SVM, posed as an unconstrained minimization problem, are obtained using functional iterative and Newton methods. Third, the minimization problem is solved directly in the primal. Unlike FSVM and Pin-SVM, our FPin-SVM does not require an optimization toolbox. We further examine the properties of FPin-SVM, such as noise insensitivity and within-class scatter minimization. We conducted experiments on synthetic and real-world datasets with various noise levels to validate the usefulness of the proposed approach. Compared to SVM, FSVM, and Pin-SVM, the presented approach demonstrates equivalent or superior generalization performance with less training time. © The Author(s) 2023.
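The abstract contrasts the hinge loss with the pinball loss. The sketch below illustrates that contrast on the margin residual; it is an illustrative reading of the losses named in the abstract, not the authors' implementation, and the function names are our own:

```python
def pinball_loss(u, tau):
    """Pinball loss on the margin residual u = 1 - y * f(x).

    With tau = 0 this reduces to the hinge loss max(0, u); a positive
    tau also penalizes well-classified points (u < 0), which is what
    makes Pin-SVM less sensitive to feature noise near the decision
    boundary.
    """
    return u if u >= 0 else -tau * u

def squared_pinball_loss(u, tau):
    # FPin-SVM replaces the L1-norm of the error variables with the
    # squared L2-norm, yielding a strongly convex objective
    # (illustrative sketch of that substitution).
    return pinball_loss(u, tau) ** 2
```

For tau = 0 the first function recovers the hinge loss exactly, so the pinball loss can be viewed as a one-parameter generalization of it.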