The responsibility weighted Mahalanobis kernel for semi-supervised training of support vector machines for classification

Cited by: 28
Authors
Reitmaier, Tobias [1]
Sick, Bernhard [1]
Affiliation
[1] Univ Kassel, Intelligent Embedded Syst Lab, D-34121 Kassel, Germany
Keywords
Support vector machine; Pattern classification; Kernel function; Responsibility weighted Mahalanobis kernel; Semi-supervised learning; Manifold regularization
DOI
10.1016/j.ins.2015.06.027
CLC number
TP [automation and computer technology]
Discipline classification code
0812
Abstract
Kernel functions in support vector machines (SVM) are needed to assess the similarity of input samples in order to classify these samples, for instance. Besides standard kernels such as Gaussian (i.e., radial basis function, RBF) or polynomial kernels, there are also specific kernels tailored to consider structure in the data for similarity assessment. In this paper, we will capture structure in data by means of probabilistic mixture density models, for example Gaussian mixtures in the case of real-valued input spaces. From the distance measures that are inherently contained in these models, e.g., Mahalanobis distances in the case of Gaussian mixtures, we derive a new kernel, the responsibility weighted Mahalanobis (RWM) kernel. Basically, this kernel emphasizes the influence of model components from which any two samples that are compared are assumed to originate (that is, the "responsible" model components). We will see that this kernel outperforms the RBF kernel and other kernels capturing structure in data (such as the LAP kernel in Laplacian SVM) in many applications where partially labeled data are available, i.e., for semi-supervised training of SVM. Other key advantages are that the RWM kernel can easily be used with standard SVM implementations and training algorithms such as sequential minimal optimization, and heuristics known for the parametrization of RBF kernels in a C-SVM can easily be transferred to this new kernel. Properties of the RWM kernel are demonstrated with 20 benchmark data sets and an increasing percentage of labeled samples in the training data. (C) 2015 Elsevier Inc. All rights reserved.
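The abstract describes the core mechanism: responsibilities computed under a mixture model (e.g., a Gaussian mixture) weight per-component Mahalanobis distances, and the resulting distance is wrapped in an exponential kernel that any standard SVM solver can consume as a precomputed Gram matrix. The sketch below illustrates that idea with scikit-learn; the function name rwm_kernel, the choice of averaging the two samples' responsibilities per component, and the toy data are assumptions made for illustration, not the paper's exact formulation.

```python
# Illustrative sketch (not the paper's exact formulation): an RWM-style kernel
# built from a Gaussian mixture fitted on all samples, used as a precomputed
# Gram matrix in a standard C-SVM.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC


def rwm_kernel(X, Y, gmm, gamma=1.0):
    """Responsibility weighted Mahalanobis kernel between rows of X and Y."""
    R_X = gmm.predict_proba(X)  # responsibilities, shape (n_x, n_components)
    R_Y = gmm.predict_proba(Y)  # shape (n_y, n_components)
    diff = X[:, None, :] - Y[None, :, :]  # pairwise differences, (n_x, n_y, d)
    D2 = np.zeros((X.shape[0], Y.shape[0]))
    for j in range(gmm.n_components):
        P_j = np.linalg.inv(gmm.covariances_[j])  # precision of component j
        maha = np.einsum('abi,ij,abj->ab', diff, P_j, diff)  # squared Mahalanobis
        # assumption: weight each pair by the mean responsibility of both samples
        w = 0.5 * (R_X[:, j][:, None] + R_Y[:, j][None, :])
        D2 += w * maha
    return np.exp(-gamma * D2)


# Semi-supervised usage: the mixture sees all samples (labeled + unlabeled),
# while the SVM itself is trained on the labeled subset only.
rng = np.random.default_rng(0)
X_all = rng.normal(size=(200, 2))  # toy data
X_lab, y_lab = X_all[:50], (X_all[:50, 0] > 0).astype(int)

gmm = GaussianMixture(n_components=3, covariance_type='full').fit(X_all)
svc = SVC(C=1.0, kernel='precomputed')
svc.fit(rwm_kernel(X_lab, X_lab, gmm), y_lab)
pred = svc.predict(rwm_kernel(X_all[50:], X_lab, gmm))  # classify the rest
```

Passing kernel='precomputed' is what makes this compatible with off-the-shelf SVM solvers such as SMO, as the abstract points out: only the Gram matrix changes, not the training algorithm. Note that the illustrative weighting above is not guaranteed to yield a strictly positive semi-definite kernel; the paper's definition should be consulted for the precise construction.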
Pages: 179-198 (20 pages)
Related papers (50 in total)
  • [31] Dong, Hongwei; Yang, Liming; Wang, Xue. Robust semi-supervised support vector machines with Laplace kernel-induced correntropy loss functions. Applied Intelligence, 2021, 51: 819-833.
  • [32] Wang, En; Wang, Zi-Yang; Wu, Qing. One novel class of Bézier smooth semi-supervised support vector machines for classification. Neural Computing and Applications, 2021, 33: 9975-9991.
  • [33] Jiang, Wei; Chang, Shih-Fu; Jebara, Tony; Loui, Alexander C. Semantic concept classification by joint semi-supervised learning of feature subspaces and support vector machines. Computer Vision - ECCV 2008, Pt IV, Proceedings, 2008, 5305: 270+.
  • [34] Abe, SG. Training of support vector machines with Mahalanobis kernels. Artificial Neural Networks: Formal Models and Their Applications - ICANN 2005, Pt 2, Proceedings, 2005, 3697: 571-576.
  • [35] Cui, Li; Xia, Yingqing. Semi-supervised sparse least squares support vector machine based on Mahalanobis distance. Applied Intelligence, 2022, 52 (12): 14294-14312.
  • [36] Chen, Wei-Jie; Shao, Yuan-Hai; Xu, Deng-Ke; Fu, Yong-Feng. Manifold proximal support vector machine for semi-supervised classification. Applied Intelligence, 2014, 40 (4): 623-638.
  • [38] Sun, Li; Jing, Ling; Xia, Xiaodong. A new proximal support vector machine for semi-supervised classification. Advances in Neural Networks - ISNN 2006, Pt 1, 2006, 3971: 1076-1082.
  • [40] Qi, Zhiquan; Tian, Yingjie; Shi, Yong. Laplacian twin support vector machine for semi-supervised classification. Neural Networks, 2012, 35: 46-53.