Least squares Support Vector Machine regression for discriminant analysis

Cited by: 7
Authors
Van Gestel, T [1 ]
Suykens, JAK [1 ]
De Brabanter, J [1 ]
De Moor, B [1 ]
Vandewalle, J [1 ]
Affiliations
[1] Katholieke Univ Leuven, Dept Elect Engn, ESAT SISTA, B-3001 Louvain, Belgium
Source
IJCNN'01: INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS | 2001
Keywords
Support Vector Machines; regression; classification; discriminant analysis; regularization;
DOI
10.1109/IJCNN.2001.938750
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Support Vector Machine classifiers aim at constructing a large margin classifier in the feature space, while a nonlinear decision boundary is obtained in the input space by mapping the inputs in a nonlinear way to a possibly infinite dimensional feature space. Mercer's condition is applied to avoid an explicit expression for the nonlinear mapping, and the solution follows from a finite dimensional quadratic programming problem. Recently, other classifier formulations related to a regularized form of Fisher Discriminant Analysis have been proposed in the feature space, for which practical expressions are obtained in a second step by applying the Mercer condition. In this paper, we relate these existing techniques to least squares support vector machines, for which the solution follows from a linear Karush-Kuhn-Tucker system in the dual space. Based on the link with empirical linear discriminant analysis, one can adjust the bias term in order to take prior information on the class distributions into account and to analyze unbalanced training sets.
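As a companion to the abstract, the following is a minimal sketch of the standard least squares SVM classifier (Suykens & Vandewalle, 1999), in which the dual solution is obtained from a single linear Karush-Kuhn-Tucker system rather than a quadratic program. The RBF kernel and the hyperparameters gamma and sigma are illustrative assumptions, not values taken from the paper.

```python
# Minimal LS-SVM classifier sketch (assumed RBF kernel; gamma and sigma are
# illustrative hyperparameters). The dual solution follows from the linear
# KKT system  [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1].
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian RBF kernel matrix K[i, j] = exp(-||a_i - b_j||^2 / (2 sigma^2))."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def lssvm_train(X, y, gamma=1.0, sigma=1.0):
    """Solve the linear KKT system for the dual variables alpha and the bias b."""
    N = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)   # Omega_kl = y_k y_l K(x_k, x_l)
    A = np.zeros((N + 1, N + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(N) / gamma
    rhs = np.concatenate(([0.0], np.ones(N)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]                             # alpha, b

def lssvm_predict(X_train, y_train, alpha, b, X_test, sigma=1.0):
    """Class label is the sign of the latent output sum_k alpha_k y_k K(x, x_k) + b."""
    K = rbf_kernel(X_test, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)

# Toy usage: two Gaussian clouds with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (50, 2)), rng.normal(+1.0, 1.0, (50, 2))])
y = np.concatenate([-np.ones(50), np.ones(50)])
alpha, b = lssvm_train(X, y, gamma=10.0, sigma=1.0)
print("training accuracy:", (lssvm_predict(X, y, alpha, b, X) == y).mean())
```

In this formulation the bias b is fixed by the KKT system itself; the paper's point about unbalanced training sets corresponds to adjusting this bias term after solving the system, using prior information on the class distributions.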
Pages: 2445-2450
Page count: 6
Related Papers
50 records in total
  • [1] Mapped least squares support vector machine regression
    Zheng, S
    Sun, YQ
    Tian, JW
    Liu, J
    INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2005, 19 (03) : 459 - 475
  • [2] A robust least squares support vector machine for regression and classification with noise
    Yang, Xiaowei
    Tan, Liangjun
    He, Lifang
    NEUROCOMPUTING, 2014, 140 : 41 - 52
  • [3] Least squares support vector machine regression with additional constrains
    Ye Hong
    Sun, Bing-Yu
    Wang, Ru Jing
    PROCEEDINGS OF 2006 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE: 50 YEARS' ACHIEVEMENTS, FUTURE DIRECTIONS AND SOCIAL IMPACTS, 2006, : 682 - 684
  • [4] Least squares support vector machine classifiers
    Suykens, JAK
    Vandewalle, J
    NEURAL PROCESSING LETTERS, 1999, 9 (03) : 293 - 300
  • [5] Least Squares Support Vector Machine Classifiers
    J.A.K. Suykens
    J. Vandewalle
    Neural Processing Letters, 1999, 9 : 293 - 300
  • [6] A New Robust Least Squares Support Vector Machine for Regression with Outliers
    Lu You
    Liu Jizhen
    Qu Yaxin
    CEIS 2011, 2011, 15
  • [7] Primal least squares twin support vector regression
    Huang, Hua-juan
    Ding, Shi-fei
    Shi, Zhong-zhi
    JOURNAL OF ZHEJIANG UNIVERSITY-SCIENCE C-COMPUTERS & ELECTRONICS, 2013, 14 (09): : 722 - 732
  • [8] Recursive reduced least squares support vector regression
    Zhao, Yongping
    Sun, Jianguo
    PATTERN RECOGNITION, 2009, 42 (05) : 837 - 842
  • [9] A robust weighted least squares support vector regression based on least trimmed squares
    Chen, Chuanfa
    Yan, Changqing
    Li, Yanyan
    NEUROCOMPUTING, 2015, 168 : 941 - 946
  • [10] Classification using least squares support vector machine for reliability analysis
    Zhi-wei Guo
    Guang-chen Bai
    Applied Mathematics and Mechanics, 2009, 30 : 853 - 864