Data sharpening methods for bias reduction in nonparametric regression

Cited: 0
Authors
Choi, E [1 ]
Hall, P [1 ]
Rousson, V [1 ]
Institution
[1] Australian Natl Univ, Ctr Math & Applicat, Canberra, ACT 0200, Australia
Source
ANNALS OF STATISTICS | 2000年 / 28卷 / 05期
Keywords
bandwidth; curse of dimensionality; design sparsity; explanatory variables; kernel methods; local-linear estimator; local-polynomial methods; Nadaraya-Watson estimator; response variables; smoothing;
DOI
Not available
CLC classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject classification codes
020208 ; 070103 ; 0714 ;
Abstract
We consider methods for kernel regression when the explanatory and/or response variables are adjusted prior to substitution into a conventional estimator. This "data-sharpening" procedure is designed to preserve the advantages of relatively simple, low-order techniques, for example, their robustness against design sparsity problems, yet attain the sorts of bias reductions that are commonly associated only with high-order methods. We consider Nadaraya-Watson and local-linear methods in detail, although data sharpening is applicable more widely. One approach in particular is found to give excellent performance. It involves adjusting both the explanatory and the response variables prior to substitution into a local-linear estimator. The change to the explanatory variables enhances resistance of the estimator to design sparsity, by increasing the density of design points in places where the original density had been low. When combined with adjustment of the response variables, it produces a reduction in bias by an order of magnitude. Moreover, these advantages are available in multivariate settings. The data-sharpening step is simple to implement, since it is explicitly defined. It does not involve functional inversion, solution of equations or use of pilot bandwidths.
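The flavor of the sharpening step can be illustrated with a minimal twicing-style sketch for the Nadaraya-Watson estimator: fit a pilot estimate, replace each response Y_i by the sharpened value 2Y_i - m̂(X_i), and refit. This is only a simplified illustration of response adjustment, not the paper's preferred algorithm (which also adjusts the explanatory variables); the Gaussian kernel, bandwidth, and function names below are assumptions for the sketch.

```python
import numpy as np

def nw_estimate(x_eval, x, y, h):
    """Nadaraya-Watson fit with a Gaussian kernel (illustrative choice)."""
    # Kernel weight of each design point x[j] at each evaluation point
    w = np.exp(-0.5 * ((np.asarray(x_eval)[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

def sharpened_nw(x_eval, x, y, h):
    """Refit on the sharpened responses 2*y - pilot (twicing-style sketch)."""
    pilot = nw_estimate(x, x, y, h)  # pilot fit at the design points
    return nw_estimate(x_eval, x, 2 * y - pilot, h)

# Noiseless demo: sharpening shrinks the curvature-driven bias of the
# plain fit on an interior evaluation grid.
x = np.linspace(0.0, 1.0, 201)
y = np.sin(2 * np.pi * x)
grid = np.linspace(0.2, 0.8, 25)
truth = np.sin(2 * np.pi * grid)
plain_err = np.max(np.abs(nw_estimate(grid, x, y, 0.05) - truth))
sharp_err = np.max(np.abs(sharpened_nw(grid, x, y, 0.05) - truth))
```

Because the Nadaraya-Watson weights are linear in the responses, this refit equals 2m̂ - m̂(m̂), so the leading O(h²) bias term cancels in the interior, which is the kind of order-of-magnitude bias reduction the abstract describes.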
Pages: 1339-1355
Number of pages: 17
Related papers
50 records in total
  • [1] High-order data sharpening with dependent errors for regression bias reduction
    He, Xuyang
    Jiang, Yuexiang
    Wang, Jiazhen
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2019, 48 (23) : 5748 - 5755
  • [2] Automatic and asymptotically optimal data sharpening for nonparametric regression
    Yao, Fang
    Lee, Thomas C. M.
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2009, 139 (12) : 4017 - 4030
  • [3] Bandwidth selection for a data sharpening estimator in nonparametric regression
    Naito, Kanta
    Yoshizaki, Masahiro
    JOURNAL OF MULTIVARIATE ANALYSIS, 2009, 100 (07) : 1465 - 1486
  • [4] A bias reduction theorem with applications in nonparametric regression
    Eubank, R. L.
    Speckman, P. L.
    SCANDINAVIAN JOURNAL OF STATISTICS, 1991, 18 (03) : 211 - 222
  • [5] A multiplicative bias reduction method for nonparametric regression
    Linton, O.
    Nielsen, J. P.
    STATISTICS & PROBABILITY LETTERS, 1994, 19 (03) : 181 - 187
  • [6] Bias reduction for nonparametric and semiparametric regression models
    Cheng, Ming-Yen
    Huang, Tao
    Liu, Peng
    Peng, Heng
    STATISTICA SINICA, 2018, 28 (04) : 2749 - 2770
  • [7] Methodology for nonparametric bias reduction in kernel regression estimation
    Slaoui, Yousri
    MONTE CARLO METHODS AND APPLICATIONS, 2023, 29 (01): : 55 - 77
  • [8] Data sharpening methods in multivariate local quadratic regression
    Wang, Xiaoying
    Jiang, Song
    Yin, Junping
    JOURNAL OF MULTIVARIATE ANALYSIS, 2012, 105 (01) : 258 - 275
  • [9] Bias reduction by projection on parametric models in Hilbertian nonparametric regression
    Lee, Y. K.
    Hong, H.
    Kim, D.
    Park, B. U.
    JOURNAL OF THE KOREAN STATISTICAL SOCIETY, 2021, 50 (03) : 634 - 653