Multi-class support vector machine based on minimization of reciprocal-geometric-margin norms

Cited by: 0
Authors
Kusunoki, Yoshifumi [1 ]
Tatsumi, Keiji [2 ]
Affiliations
[1] Osaka Metropolitan Univ, Grad Sch Informat, Gakuen Cho 1-1,Naka Ku, Sakai, Osaka 5998531, Japan
[2] Otemon Gakuin Univ, Fac Sci & Engn, Nishi Ai 2-1-15, Osaka 5678502, Japan
Keywords
Machine learning; Support vector machine; Multi-class classification; Geometric margin maximization; Feature selection; Classification
DOI
10.1016/j.ejor.2025.03.028
Chinese Library Classification
C93 [Management Science];
Discipline codes
12; 1201; 1202; 120202;
Abstract
In this paper, we propose a Support Vector Machine (SVM) method for multi-class classification. It builds on the multi-objective multi-class SVM (MMSVM), which maximizes class-pair margins of a multi-class linear classifier. The proposed method, called the reciprocal-geometric-margin-norm SVM (RGMNSVM), is derived by applying lp-norm scalarization and a convex approximation to MMSVM. Additionally, we develop a margin theory for multi-class linear classification that justifies minimizing the reciprocals of the class-pair geometric margins. Experimental results on synthetic datasets illustrate situations in which the proposed RGMNSVM works well while conventional multi-class SVMs fail to fit the underlying distributions. A classification performance evaluation on benchmark datasets shows that RGMNSVM is generally comparable with conventional multi-class SVMs, and that the proposed approach to geometric margin maximization achieves higher classification accuracy on certain real-world datasets.
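The class-pair margin idea in the abstract can be illustrated with a small NumPy sketch. The code below is not the authors' RGMNSVM (which applies lp-norm scalarization and a convex approximation to the exact reciprocal-margin objective); it is a toy stand-in that minimizes the sum of squared pairwise weight-difference norms — which act as reciprocal geometric margins once the functional margins are normalized to one — plus pairwise hinge losses, by full-batch subgradient descent. All function and parameter names are illustrative.

```python
import numpy as np

def train_pairwise_margin_svm(X, y, n_classes, C=1.0, lr=0.01, epochs=200):
    """Toy multi-class linear SVM with a class-pair margin objective:
    minimize sum_{p<q} ||w_p - w_q||^2  +  C * pairwise hinge losses.
    Illustrative sketch only, not the RGMNSVM formulation."""
    n, d = X.shape
    W = np.zeros((n_classes, d))
    b = np.zeros(n_classes)
    for _ in range(epochs):
        gW = np.zeros_like(W)
        gb = np.zeros_like(b)
        # Regularizer: sum over unordered class pairs of ||w_p - w_q||^2.
        # Small pairwise norms correspond to large geometric margins when
        # the functional margin is fixed at 1 by the constraints below.
        for p in range(n_classes):
            for q in range(p + 1, n_classes):
                diff = W[p] - W[q]
                gW[p] += 2 * diff
                gW[q] -= 2 * diff
        # Pairwise hinge: the true class must beat every rival by margin 1.
        for i in range(n):
            p = y[i]
            for q in range(n_classes):
                if q == p:
                    continue
                margin = (W[p] - W[q]) @ X[i] + b[p] - b[q]
                if margin < 1:
                    gW[p] -= C * X[i]
                    gW[q] += C * X[i]
                    gb[p] -= C
                    gb[q] += C
        W -= lr * gW
        b -= lr * gb
    return W, b

def predict(W, b, X):
    """Assign each row of X to the class with the highest linear score."""
    return np.argmax(X @ W.T + b, axis=1)
```

On well-separated Gaussian clusters this toy trainer recovers a correct linear partition; the paper's contribution lies in replacing the squared norms above with a principled reciprocal-geometric-margin objective and a convex approximation of it.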
Pages: 580-589
Page count: 10