Opening the kernel of kernel partial least squares and support vector machines

Cited by: 48
Authors
Postma, G. J. [1 ]
Krooshof, P. W. T. [1 ]
Buydens, L. M. C. [1 ]
Affiliation
[1] Radboud Univ Nijmegen, Inst Mol & Mat, NL-6500 GL Nijmegen, Netherlands
Keywords
Kernel partial least squares; Support vector regression; Kernel transformation; Variable selection; Pseudo-samples; Trajectories; REGRESSION; PLS; CLASSIFICATION; PREDICTION; TOOL
DOI
10.1016/j.aca.2011.04.025
CLC classification
O65 [Analytical Chemistry]
Subject classification
070302; 081704
Abstract
Kernel partial least squares (KPLS) and support vector regression (SVR) have become popular techniques for regression of complex non-linear data sets. The modeling is performed by mapping the data into a higher-dimensional feature space through the kernel transformation. The disadvantage of such a transformation, however, is that information about the contribution of the original variables to the regression is lost. In this paper we introduce a method which can retrieve and visualize the contribution of the variables to the regression model and the way the variables contribute to the regression of complex data sets. The method is based on the visualization of trajectories using so-called pseudo-samples representing the original variables in the data. We test and illustrate the proposed method on several synthetic and real benchmark data sets. The results show that for linear and non-linear regression models the important variables were identified with corresponding linear or non-linear trajectories. The results were verified by comparison with ordinary PLS regression, and by selecting those variables which were indicated as important and rebuilding a model with only those variables. (C) 2011 Elsevier B.V. All rights reserved.
Pages: 123 - 134
Page count: 12
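The pseudo-sample idea described in the abstract can be sketched in a few lines: build artificial samples in which one original variable sweeps its observed range while the others are held at their mean, push them through the fitted kernel model, and read the variable's contribution off the resulting prediction trajectory. The sketch below is illustrative only, not the authors' implementation: it uses RBF kernel ridge regression as a stand-in for KPLS/SVR, and the synthetic data, `gamma`, and regularization value are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y depends non-linearly on x0, linearly on x1; x2 is irrelevant.
X = rng.uniform(-1, 1, size=(100, 3))
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1]

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Kernel ridge regression as a simple stand-in for a KPLS/SVR model:
# dual coefficients alpha = (K + lam*I)^-1 y.
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(X)), y)

def predict(X_new):
    return rbf_kernel(X_new, X) @ alpha

def trajectory(j, n=50):
    # Pseudo-samples for variable j: sweep x_j over its observed range
    # while holding all other variables at their mean.
    grid = np.linspace(X[:, j].min(), X[:, j].max(), n)
    pseudo = np.tile(X.mean(axis=0), (n, 1))
    pseudo[:, j] = grid
    return grid, predict(pseudo)

# The span of each trajectory indicates that variable's contribution:
# informative variables give curved or sloped trajectories, noise
# variables give nearly flat ones.
spans = [np.ptp(trajectory(j)[1]) for j in range(X.shape[1])]
```

Plotting `trajectory(j)` for each variable reproduces the kind of trajectory plot the paper uses: a non-linear curve for `x0`, a near-linear slope for `x1`, and an essentially flat line for the uninformative `x2`.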
Related papers
50 items total
  • [21] DCA for Sparse Quadratic Kernel-Free Least Squares Semi-Supervised Support Vector Machine
    Sun, Jun
    Qu, Wentao
    MATHEMATICS, 2022, 10 (15)
  • [22] Least squares support vector machine with self-organizing multiple kernel learning and sparsity
    Liu, Chang
    Tang, Lixin
    Liu, Jiyin
    NEUROCOMPUTING, 2019, 331 : 493 - 504
  • [23] Quadratic hyper-surface kernel-free least squares support vector regression
    Ye, Junyou
    Yang, Zhixia
    Li, Zhilin
    INTELLIGENT DATA ANALYSIS, 2021, 25 (02) : 265 - 281
  • [24] Novel Kernel Orthogonal Partial Least Squares for Dominant Sensor Data Extraction
    Chen, Bo-Wei
    IEEE ACCESS, 2020, 8 (08): : 36131 - 36139
  • [25] Fuzzy least squares twin support vector machines
    Sartakhti, Javad Salimi
    Afrabandpey, Homayun
    Ghadiri, Nasser
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2019, 85 : 402 - 409
  • [26] Kernel variable selection for multicategory support vector machines
    Park, Beomjin
    Park, Changyi
    JOURNAL OF MULTIVARIATE ANALYSIS, 2021, 186
  • [27] Fast training of Support Vector Machines with Gaussian kernel
    Fischetti, Matteo
    DISCRETE OPTIMIZATION, 2016, 22 : 183 - 194
  • [28] Optimal kernel selection in twin support vector machines
    Khemchandani, Reshma
    Jayadeva
    Chandra, Suresh
    OPTIMIZATION LETTERS, 2009, 3 (01) : 77 - 88
  • [29] Support vector machines, kernel logistic regression and boosting
    Zhu, J
    Hastie, T
    MULTIPLE CLASSIFIER SYSTEMS, 2002, 2364 : 16 - 26
  • [30] Streamflow forecasting using least-squares support vector machines
    Shabri, Ani
    Suhartono
    HYDROLOGICAL SCIENCES JOURNAL-JOURNAL DES SCIENCES HYDROLOGIQUES, 2012, 57 (07): : 1275 - 1293