Regaining sparsity in kernel principal components

Cited by: 4
Authors
García-Osorio, C [1 ]
Fyfe, C [1 ]
Affiliations
[1] Univ Paisley, Appl Computat Intelligence Res Unit, Paisley PA1 2BE, Renfrew, Scotland
Keywords
sparseness; kernel methods;
DOI
10.1016/j.neucom.2004.10.115
CLC number
TP18 [Artificial intelligence theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Support Vector Machines are supervised regression and classification machines with the useful property of automatically identifying which data points are most important in constructing the machine. Kernel Principal Component Analysis (KPCA) is a related technique in that it also relies on linear operations in a feature space, but it lacks this ability to identify important points. Sparse KPCA goes too far in the other direction, identifying a single data point as most important. We show how, by bagging the data, we can create a compromise that gives a sparse, but not "grandmother" (single-point), representation for KPCA. (c) 2005 Elsevier B.V. All rights reserved.
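The bagging idea in the abstract can be illustrated with a minimal numpy sketch: fit standard kernel PCA on a small bootstrap subsample, so that only the sampled points carry the dual representation. This is an assumed simplification (RBF kernel, a single small bag, test-kernel centering omitted), not the authors' exact algorithm.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kpca_fit(X, n_components=2, gamma=1.0):
    # Standard kernel PCA: eigendecompose the centered Gram matrix.
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    vals, vecs = np.linalg.eigh(Kc)            # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:n_components]
    # Dual coefficients, normalised so projections have unit-variance scaling;
    # every point passed in contributes to the expansion.
    return vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))

# Bagging toward sparsity: fit KPCA on a bootstrap subsample, so the
# projection depends only on the points drawn into the bag (a hypothetical
# simplification of the paper's bagging scheme).
bag = rng.choice(len(X), size=20, replace=True)
support = X[bag]
alphas = kpca_fit(support, n_components=2)

# Project all points using only the 20 bagged "support" points
# (centering of the test kernel is omitted for brevity).
Z = rbf_kernel(X, support) @ alphas
print(Z.shape)  # (100, 2)
```

The full-data KPCA expansion would involve all 100 points; here the learned components are expressed through the 20 bagged points only, which is the sparseness the abstract refers to.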
Pages: 398-402
Page count: 5