Fast algorithm for kernel Fisher discriminant analysis

Cited by: 0
Authors
Zhao, Feng [1 ,3 ]
Zhang, Jun-Ying [1 ,2 ]
Liang, Jun-Li [4 ]
Affiliations
[1] School of Computer Science and Engineering, Xidian University, Xi'an 710071, China
[2] National Key Lab. of Radar Signal Processing, Xidian University, Xi'an 710071, China
[3] School of Science, Jinan University, Jinan 250012, China
[4] Institute of Acoustics, Chinese Acad. of Sci., Beijing 100080, China
Source
Dianzi Yu Xinxi Xuebao/Journal of Electronics and Information Technology | 2007 / Vol. 29 / No. 07
Keywords
Algorithms; Feature extraction; Learning systems; Pattern recognition
DOI
Not available
Abstract
Standard Kernel Fisher Discriminant Analysis (KFDA) may suffer from high computational complexity and slow feature extraction when the number of training samples is large. To tackle these problems, a fast KFDA algorithm is presented. The algorithm first introduces an optimized procedure, based on the theory of linear correlation, that finds a basis of the subspace spanned by the training samples mapped into the feature space while avoiding matrix inversion. The optimal projection vectors are then expressed as linear combinations of this basis and, combined with the Fisher criterion in the feature space, a novel criterion for computing the optimal projection vectors is derived; it only requires the eigen-decomposition of a matrix whose size equals the number of basis vectors. In addition, feature extraction for a sample only requires evaluating the kernel functions between the basis and that sample. Experimental results on different datasets demonstrate the validity of the presented algorithm.
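The paper's own derivation is not reproduced in this record, so the following is only a minimal sketch of the reduced-basis KFDA idea the abstract describes. The basis selection here uses pivoted Cholesky on the kernel matrix as a stand-in for the paper's linear-correlation-based optimized algorithm (both avoid explicit matrix inversion); the RBF kernel and the names `gamma`, `ridge`, `n_components`, `fast_kfda_fit`, and `fast_kfda_transform` are illustrative assumptions, not the authors' notation.

```python
import numpy as np
from scipy.linalg import eigh

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix between the rows of X and the rows of Y (assumed kernel choice).
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def select_basis(K, tol=1e-8):
    # Pick indices of training samples whose feature-space images are numerically
    # linearly independent, via pivoted Cholesky on the kernel matrix.
    # Stand-in for the paper's linear-correlation-based selection; no matrix inversion is used.
    d = np.diag(K).astype(float).copy()
    L = np.zeros((K.shape[0], 0))
    basis = []
    while d.max() > tol:
        i = int(np.argmax(d))
        col = (K[:, i] - L @ L[i]) / np.sqrt(d[i])
        L = np.column_stack([L, col])
        d = np.maximum(d - col**2, 0.0)
        basis.append(i)
    return basis

def fast_kfda_fit(X, y, gamma=1.0, ridge=1e-6, n_components=1):
    # Reduced-basis KFDA: projection vectors are expanded over the basis only,
    # so the generalized eigenproblem is r x r, with r = number of basis vectors.
    K = rbf_kernel(X, X, gamma)
    basis = select_basis(K)
    Kb = K[basis, :]                      # r x n kernel evaluations: basis vs. all samples
    r = len(basis)
    m_all = Kb.mean(axis=1)
    M = np.zeros((r, r))                  # between-class scatter in coefficient space
    N = np.zeros((r, r))                  # within-class scatter in coefficient space
    for c in np.unique(y):
        Kc = Kb[:, y == c]
        m_c = Kc.mean(axis=1)
        M += Kc.shape[1] * np.outer(m_c - m_all, m_c - m_all)
        D = Kc - m_c[:, None]
        N += D @ D.T
    # Generalized eigenproblem M a = lambda (N + ridge*I) a; ridge added for numerical stability.
    vals, vecs = eigh(M, N + ridge * np.eye(r))
    A = vecs[:, ::-1][:, :n_components]   # coefficients of the top discriminant directions
    return X[basis], A

def fast_kfda_transform(X_basis, A, X_new, gamma=1.0):
    # Feature extraction needs only the kernels between the basis and the new samples.
    return rbf_kernel(X_new, X_basis, gamma) @ A
```

Under these assumptions, training cost is dominated by the n x n kernel matrix and an r x r eigen-decomposition rather than an n x n one, and projecting a new sample costs r kernel evaluations instead of n, which mirrors the speed-up claimed in the abstract.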
Pages: 1731-1734