Least squares Support Vector Machine regression for discriminant analysis

Cited: 7
Authors
Van Gestel, T [1 ]
Suykens, JAK [1 ]
De Brabanter, J [1 ]
De Moor, B [1 ]
Vandewalle, J [1 ]
Affiliation
[1] Katholieke Univ Leuven, Dept Elect Engn, ESAT SISTA, B-3001 Louvain, Belgium
Source
IJCNN'01: INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS | 2001
Keywords
Support Vector Machines; regression; classification; discriminant analysis; regularization;
DOI
10.1109/IJCNN.2001.938750
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Support Vector Machine classifiers aim at constructing a large margin classifier in the feature space, while a nonlinear decision boundary is obtained in the input space by mapping the inputs in a nonlinear way to a possibly infinite dimensional feature space. Mercer's condition is applied to avoid an explicit expression for the nonlinear mapping, and the solution follows from a finite dimensional quadratic programming problem. Recently, other classifier formulations related to a regularized form of Fisher Discriminant Analysis have been proposed in the feature space, for which practical expressions are obtained in a second step by applying the Mercer condition. In this paper, we relate these existing techniques to least squares support vector machines, for which the solution follows from a linear Karush-Kuhn-Tucker system in the dual space. Based on the link with empirical linear discriminant analysis, one can adjust the bias term in order to take prior information on the class distributions into account and to analyze unbalanced training sets.
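The linear Karush-Kuhn-Tucker system mentioned in the abstract can be sketched as a single dense linear-system solve. The sketch below is a minimal illustration under assumed settings (an RBF kernel and arbitrary hyperparameters `gamma` and `sigma`), not the paper's experimental configuration:

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian RBF kernel: K[i, j] = exp(-||x1_i - x2_j||^2 / (2 sigma^2))
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM classifier's linear KKT system in the dual space:

        [ 0    y^T              ] [ b     ]   [ 0 ]
        [ y    Omega + I/gamma  ] [ alpha ] = [ 1 ]

    with Omega[i, j] = y_i * y_j * K(x_i, x_j).
    """
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual variables alpha

def lssvm_predict(X, y, alpha, b, Xtest, sigma=1.0):
    # Decision function: sign( sum_i alpha_i y_i K(x, x_i) + b );
    # shifting b here is where prior class information could be injected.
    K = rbf_kernel(Xtest, X, sigma)
    return np.sign(K @ (alpha * y) + b)
```

Because the dual problem reduces to one linear system rather than a quadratic program, training amounts to a single `np.linalg.solve` call; the first row of the system enforces the constraint `sum_i alpha_i y_i = 0`.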
Pages: 2445-2450 (6 pages)