Diversity Sampling is an Implicit Regularization for Kernel Methods

Cited by: 10
Authors
Fanuel, Michael [1 ]
Schreurs, Joachim [1 ]
Suykens, Johan A. K. [1 ]
Affiliations
[1] Katholieke Univ Leuven, Dept Elect Engn ESAT, STADIUS Ctr Dynam Syst, Signal Proc & Data Analyt, B-3001 Leuven, Belgium
Source
SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE | 2021, Vol. 3, No. 1
Funding
European Research Council;
Keywords
kernel; Nyström; determinantal point process; regularization;
DOI
10.1137/20M1320031
Chinese Library Classification
O29 [Applied Mathematics];
Discipline Code
070104;
Abstract
Kernel methods have achieved very good performance on large-scale regression and classification problems by using the Nyström method and preconditioning techniques. The Nyström approximation, based on a subset of landmarks, gives a low-rank approximation of the kernel matrix and is known to provide a form of implicit regularization. We further elaborate on the impact of sampling diverse landmarks for constructing the Nyström approximation in supervised as well as unsupervised kernel methods. By using Determinantal Point Processes (DPPs) for sampling, we obtain additional theoretical results concerning the interplay between diversity and regularization. Empirically, we demonstrate the advantages of training kernel methods on subsets made of diverse points. In particular, if the dataset has a dense bulk and a sparser tail, we show that Nyström kernel regression with diverse landmarks increases the accuracy of the regression in sparser regions of the dataset compared with uniform landmark sampling. A greedy heuristic is also proposed to select diverse samples of significant size within large datasets when exact DPP sampling is not practically feasible.
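As an illustration of the landmark-based Nyström construction summarized above, the following Python/NumPy sketch selects landmarks by greedy determinant maximization (equivalently, pivoted Cholesky on the kernel matrix), a common surrogate for exact DPP sampling. It is not the authors' code or their specific heuristic; all function names, parameters, and the synthetic data are illustrative assumptions.

# Minimal sketch (not the paper's implementation): Nystrom approximation of an
# RBF kernel matrix with landmarks chosen by a greedy diversity heuristic
# (greedy determinant maximization via pivoted Cholesky), used here as a
# stand-in for exact DPP sampling.
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Y.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def greedy_diverse_landmarks(X, m, gamma=1.0):
    # Greedily pick m landmark indices that maximize the determinant of the
    # landmark kernel sub-matrix (pivoted Cholesky on K with max-residual pivoting).
    n = X.shape[0]
    diag = np.ones(n)                 # residual diagonal; K(x_i, x_i) = 1 for RBF
    F = np.zeros((m, n))              # partial Cholesky factors of selected columns
    selected = []
    for t in range(m):
        i = int(np.argmax(diag))      # point with largest residual variance
        selected.append(i)
        k_i = rbf_kernel(X, X[i:i + 1], gamma).ravel()
        f = (k_i - F[:t].T @ F[:t, i]) / np.sqrt(diag[i])
        F[t] = f
        diag = np.maximum(diag - f**2, 0.0)
    return np.array(selected)

# Usage: rank-m Nystrom approximation K ~ C W^+ C^T built from diverse landmarks.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
idx = greedy_diverse_landmarks(X, m=20, gamma=0.5)
C = rbf_kernel(X, X[idx], gamma=0.5)          # n x m cross-kernel
W = rbf_kernel(X[idx], X[idx], gamma=0.5)     # m x m landmark kernel
K_nystrom = C @ np.linalg.pinv(W) @ C.T       # low-rank approximation of K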
Pages: 280-297
Number of pages: 18