Adjusted support vector machines based on a new loss function

Cited by: 16
Authors
Wang, Shuchun [2 ]
Jiang, Wei [1 ]
Tsui, Kwok-Leung [3 ]
Affiliations
[1] Stevens Inst Technol, Dept Syst Engn & Engn Management, Hoboken, NJ 07030 USA
[2] Golden Arc Capital Inc, New York, NY USA
[3] Georgia Inst Technol, Sch Ind & Syst Engn, Atlanta, GA 30332 USA
Funding
U.S. National Science Foundation;
Keywords
Classification error; Cross validation; Dispersion; Sampling bias; CLASSIFICATION;
DOI
10.1007/s10479-008-0495-y
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research];
Subject classification codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
The support vector machine (SVM) has attracted considerable attention recently due to its successful applications in various domains. However, by maximizing the margin of separation between the two classes in a binary classification problem, SVM solutions often suffer from two serious drawbacks. First, the SVM separating hyperplane is usually very sensitive to the training sample because it depends strongly on the support vectors, which are only a few points located on the wrong side of the corresponding margin boundaries. Second, the separating hyperplane is equidistant from the two classes, which are treated as equally important when optimizing its location, regardless of the number of training observations and their dispersion in each class. In this paper, we propose a new SVM solution, the adjusted support vector machine (ASVM), based on a new loss function that adjusts the SVM solution to account for the sample sizes and dispersions of the two classes. Numerical experiments show that ASVM outperforms the conventional SVM, especially when the two classes differ greatly in sample size and dispersion.
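The exact form of the ASVM loss is not given in this abstract. For orientation only, the minimal sketch below (Python with scikit-learn, an assumed setup, not the authors' method) contrasts a conventional soft-margin SVM with the standard class-weighting adjustment, which addresses the sample-size imbalance described above but not the per-class dispersion that the ASVM loss is said to incorporate.

# Sketch only: NOT the paper's ASVM.  Compares a conventional linear SVM with
# scikit-learn's class_weight='balanced' option, which rescales the penalty C
# by inverse class frequency (sample size) but ignores class dispersion.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import balanced_accuracy_score

# Synthetic binary problem with very unequal class sizes (90% vs 10%).
X, y = make_classification(n_samples=1000, n_features=5, n_informative=3,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

for name, clf in [("conventional SVM", SVC(kernel="linear", C=1.0)),
                  ("class-weighted SVM", SVC(kernel="linear", C=1.0,
                                             class_weight="balanced"))]:
    clf.fit(X_tr, y_tr)
    score = balanced_accuracy_score(y_te, clf.predict(X_te))
    print(f"{name}: balanced accuracy = {score:.3f}")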
Pages: 83-101
Number of pages: 19