Robust classifier learning with fuzzy class labels for large-margin support vector machines

Cited by: 18
Authors
Yang, Chan-Yun [1 ,2 ]
Chou, Jui-Jen [3 ]
Lian, Feng-Li [4 ]
Affiliations
[1] Natl Taipei Univ, Dept Elect Engn, New Taipei City 23741, Taiwan
[2] Taipei Chengshih Univ Sci & Technol, Dept Mech Engn, Taipei 11202, Taiwan
[3] Natl Taiwan Univ, Dept Bioind Mechatron Engn, Taipei 10617, Taiwan
[4] Natl Taiwan Univ, Dept Elect Engn, Taipei 10617, Taiwan
Keywords
Fuzzy class label; Membership function; Loss function; Lagrange constraint; Support vector machines; Classification; Machine learning; Pattern recognition;
DOI
10.1016/j.neucom.2012.04.009
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Using class label fuzzification, this study develops the idea of re-appraising difficult training examples to obtain a more robust classifier for large-margin support vector machines (SVMs). Fuzzification relaxes the hard-limited Lagrangian constraints imposed on the difficult examples, admits solutions that the canonical constraints would otherwise rule out, and reconfigures the resulting decision function with a wider margin. The wider margin, in turn, yields a classifier that is more robust and generalizes better. The paper traces this robustness back to changes in the governing loss function and explains the effect causally from the loss-function perspective. The study also demonstrates a two-stage experimental system that exhibits the changes induced by label fuzzification: a first-stage preprocessor captures the difficult examples and assigns them fuzzified class labels. Three types of membership functions, a constant, a linear, and a sigmoidal one, are provided in the preprocessor to map the within-class correlations of the difficult examples to the fuzzified labels. The resulting performance benchmarks confirm the robustness and generalization gained from label fuzzification. Since the change to the class label y_i' is fundamental, the idea can be transplanted to other SVM prototypes. (C) 2012 Elsevier B.V. All rights reserved.
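To make the two-stage procedure concrete, the following minimal Python sketch illustrates one way such a pipeline could be arranged: a kNN-based preprocessor scores each training example by the fraction of like-labelled neighbours (a stand-in for the paper's within-class correlation measure), one of three membership functions (constant, linear, sigmoidal) converts that score into a fuzzified label y_i', and a linear SVM is then fitted by sub-gradient descent on the hinge loss evaluated with the fuzzified labels, rather than by the paper's Lagrangian dual formulation. The function names and the k, floor, and gain parameters are illustrative assumptions, not taken from the paper.

# Hypothetical sketch of the two-stage label-fuzzification pipeline described
# in the abstract. All names and parameter values below are illustrative.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def within_class_agreement(X, y, k=7):
    # Fraction of an example's k nearest neighbours sharing its label;
    # a simple stand-in for the paper's within-class correlation measure.
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)
    return np.mean(y[idx[:, 1:]] == y[:, None], axis=1)  # drop self-neighbour

def fuzzify_labels(y, agreement, kind="sigmoid", floor=0.3, gain=8.0):
    # Map agreement in [0, 1] to a membership in [floor, 1] and scale y,
    # so difficult (low-agreement) examples receive |y_i'| < 1.
    if kind == "constant":
        m = np.where(agreement < 0.5, floor, 1.0)
    elif kind == "linear":
        m = floor + (1.0 - floor) * agreement
    else:  # sigmoidal
        m = floor + (1.0 - floor) / (1.0 + np.exp(-gain * (agreement - 0.5)))
    return y * m

def train_linear_svm(X, y_fuzzy, C=1.0, lr=1e-3, epochs=200):
    # Sub-gradient descent on 0.5*||w||^2 + C * sum(max(0, 1 - y_i'(w.x_i + b))).
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y_fuzzy * (X @ w + b)
        active = margins < 1.0  # examples violating the (relaxed) margin
        grad_w = w - C * (y_fuzzy[active, None] * X[active]).sum(axis=0)
        grad_b = -C * y_fuzzy[active].sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Usage (y is a numpy array of labels in {-1, +1}):
#   agree = within_class_agreement(X, y)
#   w, b = train_linear_svm(X, fuzzify_labels(y, agree, kind="sigmoid"))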
Pages: 1-14
Page count: 14