Sparse solution of least-squares twin multi-class support vector machine using l0 and lp-norm for classification and feature selection

Cited by: 7
Authors
Moosaei, Hossein [1 ,3 ]
Hladik, Milan [2 ,3 ]
Affiliations
[1] Univ JE Purkyne, Fac Sci, Dept Informat, Usti Nad Labem, Czech Republic
[2] Charles Univ Prague, Sch Comp Sci, Fac Math & Phys, Dept Appl Math, Prague, Czech Republic
[3] Prague Univ Econ & Business, Dept Econometr, Prague, Czech Republic
Keywords
Multi-class classification; Twin K-class support vector classification; Least-squares; Cardinality-constrained optimization problem; lp-norm; Feature selection
DOI
10.1016/j.neunet.2023.07.039
CLC number
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In the realm of multi-class classification, the twin K-class support vector classification (Twin-KSVC) generates ternary outputs {-1, 0, +1} by evaluating all training data in a "1-versus-1-versus-rest" structure. Recently, inspired by Twin-KSVC and its least-squares version, a new multi-class classifier called improvements on least-squares twin multi-class classification support vector machine (ILSTKSVC) was proposed. In this method, structural risk minimization is achieved by adding a regularization term to the minimization of empirical risk. Twin-KSVC and its improvements affect classification accuracy. Another factor influencing classification accuracy is feature selection, a critical stage in machine learning, especially when working with high-dimensional datasets; however, most prior studies have not addressed it. In this study, motivated by ILSTKSVC and the cardinality-constrained optimization problem, we propose the lp-norm least-squares twin multi-class support vector machine (PLSTKSVC) with 0 < p < 1, which performs classification and feature selection at the same time. The optimization problems associated with PLSTKSVC are easy to solve, since an approximate solution of the proposed model is obtained by solving systems of linear equations. Under certain assumptions, we investigate the properties of the optimal solutions of the related optimization problems. The proposed method was tested on several real-world datasets. According to our experiments, it outperforms all current strategies on most datasets in terms of classification accuracy while also reducing the number of features. (c) 2023 Elsevier Ltd. All rights reserved.
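The abstract's central computational claim, an lp-norm (0 < p < 1) regularized least-squares problem solved through systems of linear equations so that weights of irrelevant features are driven toward zero, can be illustrated with a minimal sketch. This is not the authors' PLSTKSVC formulation; it is a generic iteratively reweighted least-squares loop for min_w ||Aw - b||^2 + lam * ||w||_p^p, where the function name, parameters, and toy data are all illustrative assumptions.

```python
import numpy as np

def lp_reg_least_squares(A, b, p=0.5, lam=0.5, iters=50, eps=1e-6):
    """Illustrative sketch (not the authors' exact model): minimize
    ||A w - b||^2 + lam * ||w||_p^p with 0 < p < 1 by iteratively
    reweighted least squares. Each iteration solves one linear system,
    mirroring the 'systems of linear equations' strategy in the abstract."""
    w = np.linalg.lstsq(A, b, rcond=None)[0]  # ordinary least-squares start
    for _ in range(iters):
        # Reweighting: d_i approximates the curvature of |w_i|^p at the
        # current iterate; small |w_i| get a huge penalty weight and are
        # shrunk toward zero (implicit feature selection).
        d = p * (np.abs(w) + eps) ** (p - 2)
        w = np.linalg.solve(A.T @ A + lam * np.diag(d), A.T @ b)
    return w

# Toy usage: a sparse ground truth with two active features is
# recovered approximately, and the remaining weights collapse to ~0.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 10))
w_true = np.zeros(10)
w_true[[1, 4]] = [2.0, -3.0]
b = A @ w_true + 0.01 * rng.standard_normal(100)
w = lp_reg_least_squares(A, b, p=0.5, lam=0.5)
```

Because each step is a plain ridge-like linear solve, the loop stays cheap and numerically simple, which is the practical appeal of least-squares twin formulations over quadratic programming.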
Pages: 471-486
Number of pages: 16
Related papers (50 total)
  • [21] A Twin Multi-Class Classification Support Vector Machine
    Xu, Yitian
    Guo, Rui
    Wang, Laisheng
    COGNITIVE COMPUTATION, 2013, 5 (04) : 580 - 588
  • [23] Feature selection for least squares projection twin support vector machine
    Guo, Jianhui
    Yi, Ping
    Wang, Ruili
    Ye, Qiaolin
    Zhao, Chunxia
    NEUROCOMPUTING, 2014, 144 : 174 - 183
  • [24] A novel kernel-free least squares twin support vector machine for fast and accurate multi-class classification
    Gao, Zheming
    Fang, Shu-Cherng
    Gao, Xuerui
    Luo, Jian
    Medhin, Negash
    KNOWLEDGE-BASED SYSTEMS, 2021, 226
  • [25] A Multi-Class Classification Weighted Least Squares Twin Support Vector Hypersphere Using Local Density Information
    Ai, Qing
    Wang, Anna
    Zhang, Aihua
    Wang, Yang
    Sun, Haijing
    IEEE ACCESS, 2018, 6 : 17284 - 17291
  • [26] Leaf Recognition for Plant Classification Using Direct Acyclic Graph Based Multi-Class Least Squares Twin Support Vector Machine
    Tomar, Divya
    Agarwal, Sonali
    INTERNATIONAL JOURNAL OF IMAGE AND GRAPHICS, 2016, 16 (03)
  • [27] Fault diagnosis of transformer using multi-class least squares support vector machine
    Department of Electrical Engineering, Xi'an University of Technology, Xi'an 710048, China
    GAODIANYA JISHU, 2007, (6) : 110 - 113, 132
  • [28] Feature selection with kernelized multi-class support vector machine
    Guo, Yinan
    Zhang, Zirui
    Tang, Fengzhen
    PATTERN RECOGNITION, 2021, 117
  • [29] Feature Selection for Multi-class Classification using Support Vector Data Description
    Jeong, Daun
    Kang, Dongyeop
    Won, Sangchul
    IECON 2010 - 36TH ANNUAL CONFERENCE ON IEEE INDUSTRIAL ELECTRONICS SOCIETY, 2010,
  • [30] Multi-task least squares twin support vector machine for classification
    Mei, Benshan
    Xu, Yitian
    NEUROCOMPUTING, 2019, 338 : 26 - 33