ADVANCES IN NEURAL NETWORKS, PT I
2017, Vol. 10261
Keywords: Empirical feature space; Least squares support vector machine; Pattern recognition
DOI: 10.1007/978-3-319-59072-1_8
CLC number: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract:
In this paper, we propose two fast feature selection methods for sparse least squares support vector machine training in the reduced empirical feature space. In the first method, we select training vectors as basis vectors of the empirical feature space based on their similarity. The complexity of this selection can be lower than that of the conventional methods because it uses only the inner-product values of the training vectors, without the linear discriminant analysis or Cholesky factorization that the conventional methods require. The second method is forward selection by block addition, a wrapper method that reduces the size of the kernel matrix in the optimization problem. Because the computational complexity of selecting basis vectors depends on the size of the kernel matrix, the selection time can be shorter than that of the conventional methods. Using benchmark datasets, we show the effectiveness of the proposed methods.