Least squares Support Vector Machine regression for discriminant analysis

Cited: 7
Authors
Van Gestel, T [1 ]
Suykens, JAK [1 ]
De Brabanter, J [1 ]
De Moor, B [1 ]
Vandewalle, J [1 ]
Institution
[1] Katholieke Univ Leuven, Dept Elect Engn, ESAT SISTA, B-3001 Louvain, Belgium
Keywords
Support Vector Machines; regression; classification; discriminant analysis; regularization;
DOI
10.1109/IJCNN.2001.938750
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Code
081104; 0812; 0835; 1405;
Abstract
Support Vector Machine classifiers aim at constructing a large margin classifier in the feature space, while a nonlinear decision boundary is obtained in the input space by mapping the inputs in a nonlinear way to a possibly infinite dimensional feature space. Mercer's condition is applied to avoid an explicit expression for the nonlinear mapping, and the solution follows from a finite dimensional quadratic programming problem. Recently, other classifier formulations related to a regularized form of Fisher Discriminant Analysis have been proposed in the feature space, for which practical expressions are obtained in a second step by applying the Mercer condition. In this paper, we relate these existing techniques to least squares support vector machines, for which the solution follows from a linear Karush-Kuhn-Tucker system in the dual space. Based on the link with empirical linear discriminant analysis, one can adjust the bias term in order to take prior information on the class distributions into account and to analyze unbalanced training sets.
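For readers who want the concrete system behind the abstract: in the LS-SVM classifier the dual solution is obtained from one linear Karush-Kuhn-Tucker system rather than from a quadratic program. The following sketch is a minimal NumPy illustration of that system, not the authors' implementation; the RBF kernel choice, the names lssvm_train/lssvm_predict, and the hyperparameters gamma and sigma are assumptions made for the example.

import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian RBF kernel matrix K[i, j] = exp(-||x1_i - x2_j||^2 / (2 * sigma^2)).
    sq = np.sum(X1**2, axis=1)[:, None] + np.sum(X2**2, axis=1)[None, :] - 2.0 * X1 @ X2.T
    return np.exp(-sq / (2.0 * sigma**2))

def lssvm_train(X, y, gamma=1.0, sigma=1.0):
    # Solve the linear KKT system of the LS-SVM classifier
    #   [ 0   y^T             ] [ b     ]   [ 0   ]
    #   [ y   Omega + I/gamma ] [ alpha ] = [ 1_n ]
    # with Omega[k, l] = y_k * y_l * K(x_k, x_l) and labels y in {-1, +1}.
    n = X.shape[0]
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]   # dual variables alpha, bias b

def lssvm_predict(X_train, y_train, alpha, b, X_test, sigma=1.0):
    # Classify with sign( sum_k alpha_k * y_k * K(x, x_k) + b ).
    K = rbf_kernel(X_test, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)

# Toy usage on a small two-class set.
X = np.array([[0.0, 0.0], [0.0, 1.0], [2.0, 2.0], [3.0, 2.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
alpha, b = lssvm_train(X, y, gamma=10.0, sigma=1.0)
print(lssvm_predict(X, y, alpha, b, X))   # prints the predicted labels for the training points

As the abstract points out, prior information on the class distributions (for instance with unbalanced training sets) can then be taken into account by adjusting only the bias term b after solving the system; the dual variables alpha stay unchanged.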
Pages: 2445 - 2450
Page count: 6
Related papers
50 records in total
  • [1] Application of Least Squares Support Vector Machine for Regression to Reliability Analysis
    Guo Zhiwei
    Bai Guangchen
    CHINESE JOURNAL OF AERONAUTICS, 2009, 22 (02) : 160 - 166
  • [2] Mapped least squares support vector machine regression
    Zheng, S
    Sun, YQ
    Tian, JW
    Liu, J
    INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2005, 19 (03) : 459 - 475
  • [3] Hydrocarbon discriminant technique based on least squares support vector machine
    Xu, Jianhua
    Zhang, Xuegong
    Li, Yanda
    Moshi Shibie yu Rengong Zhineng/Pattern Recognition and Artificial Intelligence, 2002, 15 (04): 507 - 510
  • [4] Least squares support vector machine regression with additional constrains
    Ye Hong
    Sun, Bing-Yu
    Wang, Ru Jing
    PROCEEDINGS OF 2006 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE: 50 YEARS' ACHIEVEMENTS, FUTURE DIRECTIONS AND SOCIAL IMPACTS, 2006: 682 - 684
  • [5] Least squares support vector machine regression with boundary condition
    Yan, WW
    Zhang, MG
    Zhang, CK
    Shao, HH
    PROCEEDINGS OF 2003 INTERNATIONAL CONFERENCE ON NEURAL NETWORKS & SIGNAL PROCESSING, PROCEEDINGS, VOLS 1 AND 2, 2003: 79 - 81
  • [6] Least Squares Support Vector Machine Regression with Equality Constraints
    Liu, Kun
    Sun, Bing-Yu
    INTERNATIONAL CONFERENCE ON APPLIED PHYSICS AND INDUSTRIAL ENGINEERING 2012, PT C, 2012, 24 : 2227 - 2230
  • [7] Weighted Least Squares Twin Support Vector Machine For Regression With Noise
    Li, Juntao
    Jing, Junchang
    Cao, Yimin
    Xiao, Huimin
    PROCEEDINGS OF THE 36TH CHINESE CONTROL CONFERENCE (CCC 2017), 2017, : 9888 - 9893
  • [8] A New Robust Least Squares Support Vector Machine for Regression with Outliers
    Lu You
    Liu Jizhen
    Qu Yaxin
    CEIS 2011, 2011, 15
  • [9] Fast method for sparse least squares support vector regression machine
    College of Energy and Power Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China
    Kongzhi yu Juece/Control and Decision, 2008, (12): 1347 - 1352
  • [10] A robust least squares support vector machine for regression and classification with noise
    Yang, Xiaowei
    Tan, Liangjun
    He, Lifang
    NEUROCOMPUTING, 2014, 140 : 41 - 52