Randomized Nonlinear Component Analysis

Cited by: 0
Authors
Lopez-Paz, David [1,2]
Sra, Suvrit [1,3]
Smola, Alexander J. [3 ]
Ghahramani, Zoubin [2 ]
Schoelkopf, Bernhard [1 ]
Affiliations
[1] Max Planck Inst Intelligent Syst, Berlin, Germany
[2] Univ Cambridge, Cambridge, England
[3] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
Source
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 32 (CYCLE 2) | 2014 / Vol. 32
DOI
Not available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Classical methods such as Principal Component Analysis (PCA) and Canonical Correlation Analysis (CCA) are ubiquitous in statistics. However, these techniques are only able to reveal linear relationships in data. Although nonlinear variants of PCA and CCA have been proposed, these are computationally prohibitive at large scale. In a separate strand of recent research, randomized methods have been proposed to construct features that help reveal nonlinear patterns in data. For basic tasks such as regression or classification, random features exhibit little or no loss in performance, while achieving drastic savings in computational requirements. In this paper we leverage randomness to design scalable new variants of nonlinear PCA and CCA; our ideas extend to key multivariate analysis tools such as spectral clustering or LDA. We demonstrate our algorithms through experiments on real-world data, on which we compare against the state-of-the-art. A simple R implementation of the presented algorithms is provided.
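The abstract's core idea — map the data through random nonlinear features, then run a cheap *linear* PCA or CCA on the featurized data — can be sketched as follows. This is a rough illustration only, not the authors' released implementation (which the abstract says is in R): the function names, the use of random Fourier features for an RBF kernel, and the parameters `D`, `gamma`, and `reg` are our own assumptions.

```python
import numpy as np

def random_fourier_features(X, D=100, gamma=1.0, seed=0):
    # Random Fourier features (Rahimi & Recht) approximating the
    # RBF kernel exp(-gamma * ||x - y||^2); our parameterization.
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(X.shape[1], D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def randomized_nonlinear_pca(X, k=2, D=100, gamma=1.0, seed=0):
    # Nonlinear PCA sketch: featurize, center, then linear PCA via SVD.
    Z = random_fourier_features(X, D, gamma, seed)
    Z = Z - Z.mean(axis=0)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return Z @ Vt[:k].T  # top-k nonlinear principal scores, shape (n, k)

def randomized_nonlinear_cca(X, Y, k=1, D=100, gamma=1.0, seed=0, reg=1e-3):
    # Nonlinear CCA sketch: featurize each view, whiten with ridge
    # regularization, and read canonical correlations off an SVD of
    # the whitened cross-covariance.
    n = X.shape[0]
    Zx = random_fourier_features(X, D, gamma, seed)
    Zy = random_fourier_features(Y, D, gamma, seed + 1)
    Zx = Zx - Zx.mean(axis=0)
    Zy = Zy - Zy.mean(axis=0)
    Cxx = Zx.T @ Zx / n + reg * np.eye(D)
    Cyy = Zy.T @ Zy / n + reg * np.eye(D)
    Cxy = Zx.T @ Zy / n
    Lx = np.linalg.inv(np.linalg.cholesky(Cxx))  # whitening transforms
    Ly = np.linalg.inv(np.linalg.cholesky(Cyy))
    s = np.linalg.svd(Lx @ Cxy @ Ly.T, compute_uv=False)
    return s[:k]  # top-k (regularized) canonical correlations
```

Because the random features have fixed dimension `D`, both routines cost O(nD²) rather than the O(n³) of exact kernel PCA/CCA, which is the scalability argument the abstract makes.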
Pages: 1359-1367
Page count: 9