IJCNN'01: INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS
2001
Keywords:
Support Vector Machines;
regression;
classification;
discriminant analysis;
regularization;
DOI:
10.1109/IJCNN.2001.938750
CLC Number:
TP18 [Artificial Intelligence Theory];
Discipline Codes:
081104 ;
0812 ;
0835 ;
1405 ;
Abstract:
Support Vector Machine classifiers aim at constructing a large margin classifier in the feature space, while a nonlinear decision boundary is obtained in the input space by mapping the inputs in a nonlinear way to a possibly infinite dimensional feature space. Mercer's condition is applied to avoid an explicit expression for the nonlinear mapping, and the solution follows from a finite dimensional quadratic programming problem. Recently, other classifier formulations related to a regularized form of Fisher Discriminant Analysis have been proposed in the feature space, for which practical expressions are obtained in a second step by applying the Mercer condition. In this paper, we relate these existing techniques to least squares support vector machines (LS-SVMs), for which the solution follows from a linear Karush-Kuhn-Tucker system in the dual space. Based on the link with empirical linear discriminant analysis, one can adjust the bias term to take prior information on the class distributions into account and to analyze unbalanced training sets.
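Illustrative sketch (not code from the paper): the linear Karush-Kuhn-Tucker system mentioned in the abstract is the standard LS-SVM dual system, which can be solved directly with a linear solver instead of quadratic programming. The NumPy sketch below shows one way to do this for a binary LS-SVM classifier with an RBF kernel; the function names and the hyperparameters gamma (regularization) and sigma (kernel width) are assumptions made for this example, not choices taken from the paper.

import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian RBF kernel matrix: K[i, j] = exp(-||x1_i - x2_j||^2 / (2 sigma^2))
    d2 = np.sum(X1**2, axis=1)[:, None] + np.sum(X2**2, axis=1)[None, :] - 2.0 * X1 @ X2.T
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_train(X, y, gamma=1.0, sigma=1.0):
    # Solve the dual KKT linear system of the LS-SVM classifier:
    #   [ 0      y^T            ] [ b     ]   [ 0   ]
    #   [ y   Omega + I / gamma ] [ alpha ] = [ 1_N ]
    # with Omega[i, j] = y_i * y_j * K(x_i, x_j) and labels y_i in {-1, +1}.
    N = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((N + 1, N + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(N) / gamma
    rhs = np.concatenate(([0.0], np.ones(N)))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    return alpha, b

def lssvm_predict(X, y, alpha, b, Xtest, sigma=1.0):
    # Decision function: f(x) = sign( sum_i alpha_i * y_i * K(x, x_i) + b )
    K = rbf_kernel(Xtest, X, sigma)
    return np.sign(K @ (alpha * y) + b)

The bias adjustment discussed in the abstract amounts to shifting b before taking the sign, so that the decision threshold reflects prior information on the class distributions (e.g., for unbalanced training sets); the sketch above leaves b at the value returned by the linear system.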