ConCave-Convex procedure for support vector machines with Huber loss for text classification

Cited: 0
Authors
Borah, Parashjyoti [1 ]
Gupta, Deepak [2 ]
Hazarika, Barenya Bikash [3 ]
Affiliations
[1] Indian Inst Informat Technol Guwahati Bongora, Dept Comp Sci & Engn, Gauhati 781015, Assam, India
[2] Motilal Nehru Natl Inst Technol Allahabad, Dept Comp Sci & Engn, Prayagraj 211004, Uttar Pradesh, India
[3] Assam Town Univ, Fac Comp Technol, Sankar Madhab Path,Gandhinagar, Gauhati 781026, Assam, India
Keywords
Support vector machine; Hinge loss; ConCave-Convex procedure; Ramp loss function; Huber loss functions; REGRESSION; CLASSIFIERS; ALGORITHM;
DOI
10.1016/j.compeleceng.2024.109925
CLC Number
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812;
Abstract
The classical support vector machine (SVM) adopts the linear Hinge loss, whereas the least squares SVM (LS-SVM) employs the quadratically growing least squares loss function. The robust Ramp loss function, employed in the Ramp loss SVM (RSVM), truncates the Hinge loss so that it becomes flat beyond a specified point, which increases robustness to outliers. The recently proposed SVM with pinball loss (pin-SVM) utilizes the pinball loss function, which maximizes the margin between the class hyperplanes based on quantile distance. The Huber loss function generalizes the linear Hinge loss and the quadratic loss, and it mitigates the sensitivity of the least squares loss to noise and outliers. In this work, we employ the robust Huber loss function for SVM classification to improve generalization performance. The cost function of the proposed approach consists of one convex and one non-convex part, so minimizing it may yield a local optimum instead of a global one. We apply the ConCave-Convex Procedure (CCCP) to resolve this issue. Additionally, the proximal cost of each sample is scaled according to its class size to reduce the effect of the class imbalance problem; the proposed approach therefore incorporates class imbalance learning as well. Extensive experimental analysis establishes the efficacy of the proposed method. Furthermore, a sequential minimal optimization (SMO) procedure for the high-dimensional Huber loss SVM (HSVM) is proposed, and its performance is tested on two text classification datasets.
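To make the CCCP idea in the abstract concrete, below is a minimal sketch, not the authors' implementation: it assumes a truncated Huber-smoothed hinge loss T(u) = min(H(u), s), which splits as the difference of convex functions H(u) - max(H(u) - s, 0), so each CCCP pass linearizes the concave term at the current iterate and solves a convex subproblem. The parameters `delta` (Huber transition width), `s` (truncation level), the per-class weighting scheme, and the L-BFGS-B inner solver are all assumptions for illustration; the paper's actual formulation, proximal-cost scaling, and SMO solver may differ.

```python
# Illustrative CCCP for a truncated Huber-loss linear SVM classifier.
# ASSUMPTIONS: loss form, delta, s, and the class-size weighting below
# are hypothetical choices mirroring the abstract, not the paper's exact method.
import numpy as np
from scipy.optimize import minimize

def huber_hinge(u, delta=0.5):
    """Convex Huber-smoothed hinge H(u), u = y * f(x):
    zero for u >= 1, quadratic on [1 - delta, 1], linear below."""
    lin = u < 1 - delta
    quad = (~lin) & (u < 1)
    out = np.zeros_like(u)
    out[quad] = (1 - u[quad]) ** 2 / (2 * delta)
    out[lin] = (1 - u[lin]) - delta / 2
    return out

def huber_hinge_grad(u, delta=0.5):
    g = np.zeros_like(u)
    lin = u < 1 - delta
    quad = (~lin) & (u < 1)
    g[quad] = -(1 - u[quad]) / delta
    g[lin] = -1.0
    return g

def fit_cccp(X, y, C=1.0, delta=0.5, s=2.0, n_iter=10):
    """T(u) = min(H(u), s) = H(u) - max(H(u) - s, 0) is a difference of
    convex functions; CCCP replaces the concave part by its tangent at
    the current iterate and solves the resulting convex subproblem."""
    n, d = X.shape
    # Assumed imbalance weighting: each class contributes equally
    # to the loss regardless of its size.
    w_cls = np.where(y > 0, n / (2.0 * np.sum(y > 0)), n / (2.0 * np.sum(y < 0)))
    v = np.zeros(d + 1)                       # v = [w, b]
    for _ in range(n_iter):
        u = y * (X @ v[:-1] + v[-1])
        # Slope of the linearized concave term: active only where the
        # convex loss exceeds the truncation level s (i.e. on outliers).
        g = huber_hinge_grad(u, delta) * (huber_hinge(u, delta) > s)

        def obj(v):
            uu = y * (X @ v[:-1] + v[-1])
            return 0.5 * v[:-1] @ v[:-1] + C * np.sum(
                w_cls * (huber_hinge(uu, delta) - g * uu))

        def grad(v):
            uu = y * (X @ v[:-1] + v[-1])
            r = w_cls * (huber_hinge_grad(uu, delta) - g)
            return np.append(v[:-1] + C * X.T @ (r * y), C * np.sum(r * y))

        v = minimize(obj, v, jac=grad, method="L-BFGS-B").x
    return v[:-1], v[-1]

# Toy usage: two Gaussian classes plus one mislabelled outlier, whose
# influence the truncation caps.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (40, 2)), rng.normal(2, 1, (40, 2)), [[8.0, 8.0]]])
y = np.array([-1] * 40 + [1] * 40 + [-1])     # last point is an outlier
w, b = fit_cccp(X, y)
print("train accuracy:", np.mean(np.sign(X @ w + b) == y))
```

Because each subproblem upper-bounds the true objective and is tight at the current iterate, the CCCP objective value is non-increasing across iterations, which is what makes the convex-concave split useful despite the overall non-convexity.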
Pages: 22
Related Papers
50 records in total
  • [41] Solving imbalanced classification problems with support vector machines
    Lessmann, S
    IC-AI '04 & MLMTA '04, VOLS 1 AND 2, PROCEEDINGS, 2004 : 214 - 220
  • [42] Classification of Nucleotide Sequences Using Support Vector Machines
    Seo, Tae-Kun
    JOURNAL OF MOLECULAR EVOLUTION, 2010, 71 (04) : 250 - 267
  • [43] Stochastic Optimization Algorithms for Support Vector Machines Classification
    Bartkute-Norkuniene, Vaida
    INFORMATICA, 2009, 20 (02) : 173 - 186
  • [44] Sphere Support Vector Machines for large classification tasks
    Strack, Robert
    Kecman, Vojislav
    Strack, Beata
    Li, Qi
    NEUROCOMPUTING, 2013, 101 : 59 - 67
  • [45] Minimal Complexity Support Vector Machines for Pattern Classification
    Abe, Shigeo
    COMPUTERS, 2020, 9 (04) : 1 - 27
  • [46] Support vector machines for hyperspectral remote sensing classification
    Gualtieri, JA
    Cromp, RF
    ADVANCES IN COMPUTER-ASSISTED RECOGNITION, 1999, 3584 : 221 - 232
  • [47] Multi-task Support Vector Machine Classifier with Generalized Huber Loss
    Liu, Qi
    Zhu, Wenxin
    Dai, Zhengming
    Ma, Zhihong
    JOURNAL OF CLASSIFICATION, 2025, 42 (01) : 221 - 252
  • [48] Robust truncated hinge loss support vector machines
    Wu, Yichao
    Liu, Yufeng
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2007, 102 (479) : 974 - 983
  • [49] Clifford Support Vector Machines for Classification, Regression, and Recurrence
    Bayro-Corrochano, Eduardo Jose
    Arana-Daniel, Nancy
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2010, 21 (11) : 1731 - 1746
  • [50] Support vector machines based on convex risk functions and general norms
    Gotoh, Jun-ya
    Uryasev, Stan
    ANNALS OF OPERATIONS RESEARCH, 2017, 249 : 301 - 328