Robust nonparametric regression based on deep ReLU neural networks

Cited: 0
Authors
Chen, Juntong [1]
Affiliation
[1] Univ Twente, Dept Appl Math, Twente, Netherlands
Funding
EU Horizon 2020
Keywords
Nonparametric regression; Robust estimation; Deep neural networks; Circumventing the curse of dimensionality; Supremum of an empirical process; Convergence
DOI
10.1016/j.jspi.2024.106182
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Classification Codes
020208; 070103; 0714
Abstract
In this paper, we consider robust nonparametric regression using deep neural networks with the ReLU activation function. While several existing theoretically justified methods are geared towards robustness against identically distributed heavy-tailed noise, the rise of adversarial attacks has emphasized the importance of safeguarding estimation procedures against systematic contamination. We approach this statistical issue by shifting our focus towards estimating conditional distributions. To address it robustly, we introduce a novel estimation procedure based on ℓ-estimation. Under a mild model assumption, we establish general non-asymptotic risk bounds for the resulting estimators, showcasing their robustness against contamination, outliers, and model misspecification. We then delve into the application of our approach using deep ReLU neural networks. When the model is well specified and the regression function belongs to an α-Hölder class, employing ℓ-type estimation on suitable networks enables the resulting estimators to achieve the minimax optimal rate of convergence. Additionally, we demonstrate that deep ℓ-type estimators can circumvent the curse of dimensionality when the regression function closely resembles the composition of several Hölder functions. To attain this, we design new deep fully-connected ReLU neural networks to approximate this composition class. This approximation result can be of independent interest.
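For illustration only, the minimal sketch below (not the paper's ℓ-estimation procedure) fits a deep fully-connected ReLU network to synthetic regression data with heavy-tailed noise, using a Huber loss as a generic stand-in for a robust criterion. The network width and depth, the noise model, and all training settings are arbitrary assumptions, and a recent version of PyTorch is assumed.

```python
# Generic sketch of robust regression with a deep fully-connected ReLU network.
# NOTE: Huber loss is used here as a stand-in robust criterion; it is NOT the
# paper's l-type estimation procedure. Requires a recent PyTorch (>= 1.10).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic data: Y = f*(X) + heavy-tailed noise, with f*(x) = sin(2*pi*x)
n = 512
X = torch.rand(n, 1)
noise = torch.distributions.StudentT(df=2.0).sample((n, 1))  # heavy tails
Y = torch.sin(2 * torch.pi * X) + 0.3 * noise

# Deep fully-connected ReLU network (width/depth chosen for illustration)
model = nn.Sequential(
    nn.Linear(1, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)

loss_fn = nn.HuberLoss(delta=1.0)  # robust surrogate loss (assumption)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(X), Y)
    loss.backward()
    optimizer.step()

with torch.no_grad():
    grid = torch.linspace(0.0, 1.0, 5).unsqueeze(1)
    print(model(grid).squeeze())  # fitted values on a coarse grid
```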
Pages: 25
Related Papers
50 items in total
  • [31] Optimal approximation of piecewise smooth functions using deep ReLU neural networks
    Petersen, Philipp
    Voigtlaender, Felix
    NEURAL NETWORKS, 2018, 108 : 296 - 330
  • [32] Error bounds for approximations with deep ReLU neural networks in Ws,p norms
    Guehring, Ingo
    Kutyniok, Gitta
    Petersen, Philipp
    ANALYSIS AND APPLICATIONS, 2020, 18 (05) : 803 - 859
  • [33] On the Bayesian Interpretation of Robust Regression Neural Networks
    Kalina, Jan
    Vidnerova, Petra
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING-ICANN 2024, PT I, 2024, 15016 : 30 - 40
  • [34] Optimal Approximation Rates for Deep ReLU Neural Networks on Sobolev and Besov Spaces
    Siegel, Jonathan W.
    JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24
  • [35] Bayesian robust nonparametric regression
    Smith, M
    Kohn, R
    AMERICAN STATISTICAL ASSOCIATION - 1996 PROCEEDINGS OF THE SECTION ON BAYESIAN STATISTICAL SCIENCE, 1996, : 202 - 207
  • [36] Robust nonparametric regression and modality
    Kovac, A
    DEVELOPMENTS IN ROBUST STATISTICS, 2003, : 218 - 227
  • [37] ROBUST NONPARAMETRIC REGRESSION ESTIMATION
    BOENTE, G
    FRAIMAN, R
    JOURNAL OF MULTIVARIATE ANALYSIS, 1989, 29 (02) : 180 - 198
  • [38] Robust nonparametric regression: A review
    Cizek, Pavel
    Sadikoglu, Serhan
    WILEY INTERDISCIPLINARY REVIEWS-COMPUTATIONAL STATISTICS, 2020, 12 (03)
  • [39] Robust Quantization of Deep Neural Networks
    Kim, Youngseok
    Lee, Junyeol
    Kim, Younghoon
    Seo, Jiwon
    PROCEEDINGS OF THE 29TH INTERNATIONAL CONFERENCE ON COMPILER CONSTRUCTION (CC '20), 2020, : 74 - 84
  • [40] Nonlinear Approximation and (Deep) ReLU Networks
    Daubechies, I.
    DeVore, R.
    Foucart, S.
    Hanin, B.
    Petrova, G.
    CONSTRUCTIVE APPROXIMATION, 2022, 55 (01) : 127 - 172