On the Inconsistency of Kernel Ridgeless Regression in Fixed Dimensions

Cited: 2
Authors
Beaglehole, Daniel [1]
Belkin, Mikhail [1,2]
Pandit, Parthe [2]
Affiliations
[1] Univ Calif San Diego, Comp Sci & Engn, San Diego, CA 92093 USA
[2] Univ Calif San Diego, Halicioglu Data Sci Inst, San Diego, CA 92093 USA
Source
SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE | 2023, Vol. 5, No. 4
Keywords
kernel machines; interpolation; consistency; ridgeless regression; benign overfitting; nonparametric regression
DOI
10.1137/22M1499819
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
"Benign overfitting," the ability of certain algorithms to interpolate noisy training data and yet perform well out-of-sample, has been a topic of considerable recent interest. We show, using a fixed design setup, that an important class of predictors, kernel machines with translation-invariant kernels, does not exhibit benign overfitting in fixed dimensions. In particular, the estimated predictor does not converge to the ground truth with increasing sample size, for any nonzero regression function and any (even adaptive) bandwidth selection. To prove these results, we give exact expressions for the generalization error and its decomposition into an approximation error and an estimation error, which reveals a trade-off governed by the choice of kernel bandwidth. Our results apply to commonly used translation-invariant kernels such as the Gaussian, Laplace, and Cauchy kernels.
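The abstract's object of study, the kernel "ridgeless" predictor, is the minimum-norm interpolant obtained from kernel ridge regression as the regularization tends to zero. A minimal sketch in NumPy, using the Laplace kernel (one of the translation-invariant kernels the paper covers); the fixed design, bandwidth, and target function below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def laplace_kernel(X, Z, bandwidth):
    # Translation-invariant kernel K(x, z) = exp(-||x - z|| / bandwidth)
    dists = np.sqrt(((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1))
    return np.exp(-dists / bandwidth)

def ridgeless_fit(X, y, bandwidth):
    # Interpolating coefficients: alpha = K(X, X)^{-1} y
    # (kernel ridge regression with the ridge penalty sent to zero)
    return np.linalg.solve(laplace_kernel(X, X, bandwidth), y)

def ridgeless_predict(X_train, alpha, X_new, bandwidth):
    return laplace_kernel(X_new, X_train, bandwidth) @ alpha

rng = np.random.default_rng(0)
X = np.linspace(-1.0, 1.0, 20)[:, None]                     # fixed design, d = 1
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(20)   # noisy labels

bandwidth = 0.2
alpha = ridgeless_fit(X, y, bandwidth)
y_hat = ridgeless_predict(X, alpha, X, bandwidth)

# The ridgeless predictor reproduces the noisy labels exactly (up to
# numerical error): it interpolates, which is the regime in which the
# paper shows the bandwidth trade-off prevents consistency.
print(np.allclose(y_hat, y, atol=1e-8))  # -> True
```

The choice of bandwidth controls the approximation/estimation trade-off discussed in the abstract: shrinking it drives the kernel matrix toward the identity (pure memorization), while growing it oversmooths; the paper's point is that no schedule of bandwidths, even data-adaptive ones, makes this interpolant consistent in fixed dimension.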
Pages: 854-872
Page count: 19