Sparse solution of least-squares twin multi-class support vector machine using l0 and lp-norm for classification and feature selection

Cited by: 7
Authors
Moosaei, Hossein [1 ,3 ]
Hladik, Milan [2 ,3 ]
Affiliations
[1] Univ JE Purkyne, Fac Sci, Dept Informat, Usti Nad Labem, Czech Republic
[2] Charles Univ Prague, Sch Comp Sci, Fac Math & Phys, Dept Appl Math, Prague, Czech Republic
[3] Prague Univ Econ & Business, Dept Econometr, Prague, Czech Republic
Keywords
Multi-class classification; Twin k-class support vector classification; Least-squares; Cardinality-constrained optimization problem; l(p)-norm; Feature selection; ALGORITHM; IMPROVEMENTS;
DOI
10.1016/j.neunet.2023.07.039
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In the realm of multi-class classification, the twin K-class support vector classification (Twin-KSVC) generates ternary outputs {-1, 0, +1} by evaluating all training data in a "1-versus-1-versus-rest" structure. Recently, inspired by Twin-KSVC and its least-squares version, a new multi-class classifier called improvements on least-squares twin multi-class classification support vector machine (ILSTKSVC) has been proposed. In this method, structural risk minimization is achieved by incorporating a regularization term in addition to the minimization of empirical risk. Twin-KSVC and its improvements affect classification accuracy. Another factor influencing classification accuracy is feature selection, a critical stage in machine learning, especially when working with high-dimensional datasets; however, most prior studies have not addressed this crucial aspect. In this study, motivated by ILSTKSVC and the cardinality-constrained optimization problem, we propose the l(p)-norm least-squares twin multi-class support vector machine (PLSTKSVC) with 0 < p < 1, which performs classification and feature selection at the same time. The technique employed to solve the optimization problems associated with PLSTKSVC is user-friendly: an approximate solution of the proposed model is obtained by solving systems of linear equations. Under certain assumptions, we investigate the properties of the optimal solutions of the related optimization problems. The suggested method was tested on several real-world datasets. According to our experiments, the proposed method outperforms all current strategies on most datasets in terms of classification accuracy while also reducing the number of features. (c) 2023 Elsevier Ltd. All rights reserved.
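The abstract states that PLSTKSVC replaces the usual quadratic programs with systems of linear equations, while an l(p)-norm term (0 < p < 1) induces sparsity for feature selection. The paper's exact formulation is not reproduced in this record; as a minimal illustrative sketch of the general idea only, the code below applies iteratively reweighted least squares (IRLS, a standard surrogate scheme for l(p) penalties) to a plain l(p)-regularized least-squares problem, so that each iteration reduces to a single linear solve. The function name, parameters, and data are all hypothetical and not taken from the paper.

```python
import numpy as np

def lp_irls(X, y, lam=0.5, p=0.5, eps=1e-6, iters=50):
    """Approximate argmin_w ||X w - y||^2 + lam * sum_i |w_i|^p for 0 < p < 1.

    Illustrative IRLS scheme: the non-convex |w_i|^p term is replaced at each
    step by a quadratic surrogate with weight p * |w_i|^(p-2), so every
    iteration is just one regularized linear system (no quadratic program).
    """
    # Ordinary least-squares warm start.
    w = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        # eps guards against division by zero as coefficients shrink to 0.
        weights = p * np.power(np.abs(w) + eps, p - 2.0)
        A = X.T @ X + lam * np.diag(weights)
        w = np.linalg.solve(A, X.T @ y)
    return w

# Synthetic example: 20 features, only the first 3 informative.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
true_w = np.zeros(20)
true_w[:3] = [2.0, -1.5, 1.0]
y = X @ true_w + 0.01 * rng.standard_normal(100)

w = lp_irls(X, y, lam=0.5, p=0.5)
print(int(np.sum(np.abs(w) > 1e-2)))  # count of effectively selected features
```

Because the small-magnitude coefficients receive ever-larger weights |w_i|^(p-2), they are driven toward zero across iterations, which is the mechanism by which l(p) penalties with 0 < p < 1 perform feature selection.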
Pages: 471-486 (16 pages)
Related papers (50 records)
  • [41] Feature selection for multi-class problems using support vector machines
    Li, GZ
    Yang, J
    Liu, GP
    Xue, L
    PRICAI 2004: TRENDS IN ARTIFICIAL INTELLIGENCE, PROCEEDINGS, 2004, 3157 : 292 - 300
  • [42] The best separating decision tree twin support vector machine for multi-class classification
    Shao, Yuan-Hai
    Chen, Wei-Jie
    Huang, Wen-Biao
    Yang, Zhi-Min
    Deng, Nai-Yang
    FIRST INTERNATIONAL CONFERENCE ON INFORMATION TECHNOLOGY AND QUANTITATIVE MANAGEMENT, 2013, 17 : 1032 - 1038
  • [43] Multi-class Classification using Support Vector Regression Machine with Consistency
    Jia, Wei
    Liang, Junli
    Zhang, Miaohua
    Ye, Xin
    2015 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC), 2015, : 848 - 851
  • [44] Decision Tree Twin Support Vector Machine Based on Kernel Clustering for Multi-class Classification
    Dou, Qingyun
    Zhang, Li
    NEURAL INFORMATION PROCESSING (ICONIP 2018), PT IV, 2018, 11304 : 293 - 303
  • [45] Predicting the Listing Status of Chinese Listed Companies Using Twin Multi-class Classification Support Vector Machine
    Zhao, Sining
    Fujita, Hamido
    ADVANCES AND TRENDS IN ARTIFICIAL INTELLIGENCE: FROM THEORY TO PRACTICE, 2019, 11606 : 50 - 62
  • [46] Least squares twin support vector machine classification via maximum one-class within class variance
    Ye, Qiaolin
    Zhao, Chunxia
    Ye, Ning
    OPTIMIZATION METHODS & SOFTWARE, 2012, 27 (01): : 53 - 69
  • [47] Hierarchical Agglomerative Clustering Based Combined Feature Selection and Multi-Class Support Vector Machine for Brain Tumour Classification
    Maya, U. C.
    Meenakshy, K.
    JOURNAL OF MEDICAL IMAGING AND HEALTH INFORMATICS, 2017, 7 (08) : 1714 - 1722
  • [49] Capped L2,p-norm metric based robust least squares twin support vector machine for pattern classification
    Yuan, Chao
    Yang, Liming
    NEURAL NETWORKS, 2021, 142 : 457 - 478
  • [50] Fuzzy Rules Extraction from Support Vector Machines for Multi-class Classification with Feature Selection
    Chaves, Adriana da Costa F.
    Vellasco, Marley
    Tanscheit, Ricardo
    ADVANCES IN NEURO-INFORMATION PROCESSING, PT II, 2009, 5507 : 386 - 393