Sparse and robust support vector machine with capped squared loss for large-scale pattern classification

Cited by: 2
Authors
Wang, Huajun [1 ]
Zhang, Hongwei [1 ]
Li, Wenqian [2 ]
Affiliations
[1] Changsha Univ Sci & Technol, Dept Math & Stat, Changsha, Peoples R China
[2] Hunan Normal Univ, Coll Life Sci, Changsha, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Capped squared loss; Fast algorithm; Support vectors; Low computational complexity; Working set;
DOI
10.1016/j.patcog.2024.110544
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Support vector machine (SVM), widely regarded as one of the most effective tools for classification, has received extensive attention across many fields. However, its performance degrades on large-scale pattern classification tasks because of high memory requirements and slow running time. To address this challenge, we construct a novel sparse and robust SVM based on our newly proposed capped squared loss (termed L-csl-SVM). To solve L-csl-SVM, we first establish its optimality theory via our defined proximal stationary point, which allows us to efficiently characterize the L-csl support vectors of L-csl-SVM. We then show that the L-csl support vectors form only a small fraction of the entire training data, an observation that motivates the concept of a working set. Furthermore, we design a novel subspace fast algorithm with a working set (termed L-csl-ADMM) for solving L-csl-SVM and prove that it enjoys both global convergence and relatively low computational complexity. Finally, numerical experiments show that L-csl-ADMM achieves the best classification accuracy, the shortest running time, and the smallest number of support vectors when solving large-scale pattern classification problems.
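The record does not reproduce the loss definition, so the following is a minimal sketch assuming the capped squared loss takes the commonly used form l_a(t) = min{(max(0, 1 - t))^2, a} for margin t = y(w·x + b) and cap a > 0; the paper's exact definition, proximal operator, and working-set rule may differ. The sketch only illustrates why capping bounds each sample's loss and why samples at zero loss or at the cap can be screened out, leaving a small working set of candidate support vectors.

import numpy as np

def capped_squared_loss(margins, cap):
    # Hypothetical capped squared (hinge-type) loss:
    # l(t) = min((max(0, 1 - t))**2, cap).
    # Correctly classified points with margin >= 1 contribute zero loss;
    # points whose squared violation exceeds `cap` are truncated, which
    # limits the influence of outliers.
    violation = np.maximum(0.0, 1.0 - margins)
    return np.minimum(violation ** 2, cap)

def working_set(X, y, w, b, cap):
    # Indices of samples whose loss lies strictly between 0 and the cap.
    # Under the assumed loss, only these samples can act as support
    # vectors: zero-loss points are well classified and capped points are
    # treated as outliers with constant loss.
    margins = y * (X @ w + b)
    violation = np.maximum(0.0, 1.0 - margins)
    active = (violation > 0.0) & (violation ** 2 < cap)
    return np.flatnonzero(active)

# Toy usage: a random linear classifier on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = np.sign(rng.normal(size=1000))
w, b, cap = rng.normal(size=5), 0.0, 1.0
print("mean loss:", capped_squared_loss(y * (X @ w + b), cap).mean())
print("working-set size:", working_set(X, y, w, b, cap).size)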
Pages: 18