The support vector machine (SVM) has attracted considerable attention recently due to its successful applications in various domains. However, by maximizing the margin of separation between the two classes of a binary classification problem, SVM solutions often suffer from two serious drawbacks. First, the SVM separating hyperplane is usually very sensitive to the training samples, since it depends strongly on the support vectors, which are only a few points located on the wrong side of their corresponding margin boundaries. Second, the separating hyperplane is equidistant from the two classes, which are treated as equally important when its location is optimized, regardless of the number of training samples and their dispersion in each class. In this paper, we propose a new SVM solution, the adjusted support vector machine (ASVM), which is based on a new loss function that adjusts the SVM solution to account for the sample sizes and dispersions of the two classes. Numerical experiments show that the ASVM outperforms the conventional SVM, especially when the two classes differ greatly in sample size and dispersion.
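As a concrete illustration of the imbalance problem described above, the following sketch contrasts a conventional linear SVM with a class-weighted one on synthetic data in which the two classes differ sharply in sample size and dispersion. The reweighting uses scikit-learn's standard class_weight mechanism as a stand-in; it is not the ASVM loss proposed in the paper, and the data and parameters are invented for the demonstration.

```python
# Sketch only: scikit-learn's class_weight reweighting stands in for the
# paper's ASVM loss, to show how shifting the misclassification penalty
# between classes moves the separating hyperplane.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Majority class: many points with large dispersion; minority class: few, tight.
X_maj = rng.normal(loc=[0.0, 0.0], scale=2.0, size=(500, 2))
X_min = rng.normal(loc=[4.0, 4.0], scale=0.5, size=(25, 2))
X = np.vstack([X_maj, X_min])
y = np.array([0] * 500 + [1] * 25)

# Conventional SVM: both classes carry the same misclassification cost.
plain = SVC(kernel="linear", C=1.0).fit(X, y)

# Reweighted SVM: errors on the minority class cost more, pulling the
# hyperplane back toward the majority class.
weighted = SVC(kernel="linear", C=1.0, class_weight="balanced").fit(X, y)

for name, model in [("plain", plain), ("balanced", weighted)]:
    w, b = model.coef_[0], model.intercept_[0]
    print(f"{name}: w={w.round(3)}, b={b:.3f}, n_support={model.n_support_.tolist()}")
```

Because the large, widely dispersed class contributes far more potential margin violations, the unweighted hyperplane tends to sit closer to the small, tight class; reweighting (or, in the paper, adjusting the loss itself) shifts it toward a more favorable position.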
Affiliation:
Virginia Polytech Inst & State Univ, Dept Business Informat Technol, Pamplin Coll Business, Blacksburg, VA 24061 USA