A sparse robust model for large scale multi-class classification based on K-SVCR

Cited: 5
Authors
Ma, Jiajun [1 ,2 ]
Zhou, Shuisheng [1 ]
Chen, Li [1 ]
Wan, Weiwei [1 ]
Zhang, Zhuan [1 ]
Affiliations
[1] Xidian Univ, Sch Math & Stat, Xian 710071, Shaanxi, Peoples R China
[2] Shangluo Univ, Coll Math & Comp Applicat, Shangluo 726000, Shaanxi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
SVM; Multi-class classification; Outliers; K-SVCR; Sparse solution; SUPPORT VECTOR MACHINE; REGRESSION;
DOI
10.1016/j.patrec.2018.11.012
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
K-Support Vector Classification Regression (K-SVCR) is a multi-class classification method based on a "1-vs-1-vs-rest" structure, which yields better predictions because all the data are taken into account while attention centres on a two-class partition. However, K-SVCR is both sensitive to outliers and time-consuming. In this paper, we propose a robust least-squares version of K-SVCR (K-RLSSVCR) based on a squared epsilon-insensitive ramp loss and a truncated least-squares loss; these nonconvex losses partially suppress the impact of outliers on the new model. With the Concave-Convex Procedure (CCP), solving K-RLSSVCR reduces to solving only a system of linear equations per iteration. For training large-scale problems, we derive an equivalent K-RLSSVCR model in the primal space (Primal K-RLSSVCR) via the representer theorem, which may admit a sparse solution if the corresponding kernel matrix has low rank. We design a sparse K-RLSSVCR (K-SRLSSVCR) algorithm that obtains a sparse solution of the Primal K-RLSSVCR by approximating the kernel matrix with a low-rank matrix. Experimental results on an artificial data set and benchmark data sets show that the proposed method achieves classification performance better than or comparable to that of related algorithms, with remarkably less training time and memory consumption, especially on large-scale problems. (C) 2018 Elsevier B.V. All rights reserved.
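The computational route the abstract describes can be illustrated with a minimal sketch (not the authors' implementation; all function names, the Nyström landmark count `m`, and the regularization parameter `lam` are illustrative assumptions): approximate an RBF kernel matrix by a low-rank factor K ≈ PPᵀ, after which each least-squares step of a CCP-style iteration reduces to a linear system that the Woodbury identity solves with only an m × m factorization instead of an n × n one.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # Pairwise RBF kernel values between the rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def nystrom_factor(X, m, gamma=0.5, seed=0):
    # Low-rank factor P (n x m) with K ~= P @ P.T via the Nystrom method:
    # sample m landmark rows, then P = C W^{-1/2} with C = K(X, X_m), W = K(X_m, X_m).
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    C = rbf_kernel(X, X[idx], gamma)            # n x m cross-kernel block
    W = C[idx]                                  # m x m landmark kernel block
    U, s, _ = np.linalg.svd((W + W.T) / 2)      # symmetrize, then eigendecompose
    return C @ (U / np.sqrt(np.maximum(s, 1e-12)))

def ls_solve_lowrank(P, y, lam=1.0):
    # Solve (P P^T + lam I) alpha = y via the Woodbury identity:
    # alpha = (y - P (lam I_m + P^T P)^{-1} P^T y) / lam  -- only an m x m solve.
    m = P.shape[1]
    inner = np.linalg.solve(lam * np.eye(m) + P.T @ P, P.T @ y)
    return (y - P @ inner) / lam

# Toy usage: 500 points, rank-50 approximation.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5))
y = np.sign(X[:, 0])
P = nystrom_factor(X, m=50)
alpha = ls_solve_lowrank(P, y)
```

Because only P (n × m) is stored and the inner solve is m × m, both memory and per-iteration cost drop from O(n²) toward O(nm), which is the kind of saving the abstract reports for large-scale training.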
Pages: 16 - 23
Page count: 8
Related Papers
50 items total
  • [1] SPARSE LEAST SQUARES K-SVCR MULTI-CLASS CLASSIFICATION
    Moosaei, Hossein
    JOURNAL OF NONLINEAR AND VARIATIONAL ANALYSIS, 2024, 8 (06): : 953 - 971
  • [3] Least squares approach to K-SVCR multi-class classification with its applications
    Moosaei, Hossein
    Hladik, Milan
    ANNALS OF MATHEMATICS AND ARTIFICIAL INTELLIGENCE, 2022, 90 (7-9) : 873 - 892
  • [4] Newton-based approach to solving K-SVCR and Twin-KSVC multi-class classification in the primal space
    Moosaei, Hossein
    Hladik, Milan
    Razzaghi, Mohamad
    Ketabchi, Saeed
    COMPUTERS & OPERATIONS RESEARCH, 2023, 160
  • [5] Semi supervised K-SVCR for multi-class classification
    Srivastava V.P.
    Kapil
    Multimedia Tools and Applications, 2025, 84 (9) : 6737 - 6753
  • [6] New Multi-class Classification Method Based on the SVDD Model
    Yang, Lei
    Ma, Wei-Min
    Tian, Bo
    ADVANCES IN NEURAL NETWORKS - ISNN 2011, PT II, 2011, 6676 : 103 - +
  • [7] Large scale multi-class classification with truncated nuclear norm regularization
    Hu, Yao
    Jin, Zhongming
    Shi, Yi
    Zhang, Debing
    Cai, Deng
    He, Xiaofei
    NEUROCOMPUTING, 2015, 148 : 310 - 317
  • [8] Visual Comparison Based on Multi-class Classification Model
    Shi, Hanqin
    Tao, Liang
    IMAGE AND VIDEO TECHNOLOGY (PSIVT 2017), 2018, 10749 : 75 - 86
  • [9] Robust weighted linear loss twin multi-class support vector regression for large-scale classification
    Qiang, Wenwen
    Zhang, Jinxin
    Zhen, Ling
    Jing, Ling
    SIGNAL PROCESSING, 2020, 170
  • [10] Large-Scale Multi-Class Image-Based Cell Classification With Deep Learning
    Meng, Nan
    Lam, Edmund Y.
    Tsia, Kevin K.
    So, Hayden Kwok-Hay
    IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2019, 23 (05) : 2091 - 2098