On the eigenspectrum of the Gram matrix and the generalization error of kernel-PCA

Cited by: 94
Authors
Shawe-Taylor, J [1]
Williams, CKI
Cristianini, N
Kandola, J
Affiliations
[1] Univ Southampton, Sch Elect & Comp Sci, Southampton SO17 1BJ, Hants, England
[2] Univ Edinburgh, Div Informat, Edinburgh EH1 2QL, Midlothian, Scotland
[3] Univ Calif Davis, Dept Stat, Davis, CA 95616 USA
[4] Merrill Lynch Quantitat Analyt Div, London EC1A 1HQ, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
concentration bounds; Gram matrices; kernel methods; principal components analysis (PCA); Rademacher complexity; spectra of random matrices; statistical learning theory;
DOI
10.1109/TIT.2005.850052
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline classification code
0812;
Abstract
In this paper, the relationship between the eigenvalues of the m × m Gram matrix K for a kernel κ(·,·), computed on a sample x_1, …, x_m drawn from a density p(x), and the eigenvalues of the corresponding continuous eigenproblem is analyzed. The differences between the two spectra are bounded, and a performance bound on kernel principal component analysis (PCA) is provided, showing that good performance can be expected even in very high-dimensional feature spaces provided the sample eigenvalues fall sufficiently quickly.
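The relationship described in the abstract can be illustrated numerically: the eigenvalues of (1/m)K act as empirical estimates of the eigenvalues of the continuous eigenproblem, and the sum of the tail eigenvalues gives the training-sample residual of projecting onto the top-k kernel-PCA directions. The following is a minimal sketch, not taken from the paper, assuming an RBF kernel and a standard normal sample density; the helper rbf_gram and all parameter values are illustrative.

```python
import numpy as np

def rbf_gram(X, sigma=1.0):
    """Gram matrix K with K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)). Illustrative helper."""
    sq = np.sum(X**2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    return np.exp(-d2 / (2.0 * sigma**2))

rng = np.random.default_rng(0)
m = 500
X = rng.normal(size=(m, 1))               # sample x_1, ..., x_m from p(x) = N(0, 1) (assumed density)

K = rbf_gram(X, sigma=1.0)
lam_hat = np.linalg.eigvalsh(K)[::-1] / m  # eigenvalues of (1/m)K in descending order:
                                           # empirical estimates of the process eigenvalues

# Residual of projecting the training sample onto the top-k empirical principal
# directions in feature space: the sum of the remaining (tail) eigenvalues of (1/m)K.
k = 10
tail = lam_hat[k:].sum()
print(f"top-{k} eigenvalues of (1/m)K: {lam_hat[:k]}")
print(f"kernel-PCA training residual (tail sum): {tail:.6f}")
```

Under the paper's setting, a rapidly decaying empirical spectrum (small tail sum) is what licenses the conclusion that a low-dimensional kernel-PCA projection captures most of the variance, even when the ambient feature space is very high-dimensional.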
Pages: 2510-2522
Number of pages: 13