Affinity and transformed class probability-based fuzzy least squares support vector machines

Cited: 19
Authors
Borah, Parashjyoti [1 ]
Gupta, Deepak [2 ]
Affiliations
[1] Indian Inst Informat Technol, Dept Comp Sci & Engn, Gauhati, India
[2] Natl Inst Technol, Dept Comp Sci & Engn, Jote, Arunachal Pradesh, India
Keywords
Support vector machine; Fuzzy membership; Class affinity; Class probability; Loss function; Truncated least squares loss; Classification problems; Outliers
DOI
10.1016/j.fss.2022.03.009
Chinese Library Classification (CLC) Number
TP301 [Theory, Methods]
Subject Classification Code
081202
Abstract
Inspired by the generalization efficiency of the affinity and class probability-based fuzzy support vector machine (ACFSVM), a pair of class affinity and nonlinearly transformed class probability-based fuzzy least squares support vector machine approaches is proposed. The proposed approaches handle the class imbalance problem by employing cost-sensitive learning and by utilizing each sample's class probability, determined using a novel nonlinear probability equation that adjusts itself with class size. Further, sensitivity to outliers and noise is reduced using each sample's affinity to its own class, obtained with the least squares one-class support vector machine. The first proposed approach incorporates fuzzy membership values, computed from the transformed class probability and class affinity, into the objective function of an LS-SVM-type formulation, and introduces a new cost-sensitive term based on the class cardinalities to normalize the effect of class imbalance. The inherent noise and outlier sensitivity of the quadratic least squares loss function of the first approach is further reduced in the second proposed approach by truncating the quadratic growth of the loss function at a specified score, so that the concerns due to noise and outliers are also handled at the optimization level. However, the truncated loss function employed in the second approach takes a non-convex structure, which, in turn, is resolved using the ConCave-Convex Procedure (CCCP) for global convergence. Numerical experiments on artificial and real-world datasets with different imbalance ratios establish the effectiveness of the proposed approaches. (C) 2022 Elsevier B.V. All rights reserved.
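Note: as an illustration of the truncation idea mentioned in the abstract, a minimal sketch is given below, assuming a generic least squares loss on a slack variable \xi capped at a score s > 0; the exact loss and notation used in the paper are not stated here, so this should be read as an assumed, illustrative form rather than the authors' formulation.

\[
L_{s}(\xi) \;=\; \min\bigl(\xi^{2},\, s^{2}\bigr) \;=\; \xi^{2} \;-\; \max\bigl(\xi^{2} - s^{2},\, 0\bigr)
\]

Written this way, the non-convex truncated loss is a difference of two convex functions, so CCCP can handle the resulting optimization problem by repeatedly linearizing the subtracted concave part and solving the remaining convex subproblem.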
Pages: 203-235
Number of pages: 33