Robust nonparametric regression based on deep ReLU neural networks

Cited: 0
Authors
Chen, Juntong [1 ]
Affiliations
[1] Univ Twente, Dept Appl Math, Twente, Netherlands
Funding
EU Horizon 2020
Keywords
Nonparametric; Robust estimation; Deep neural networks; Circumventing the curse of dimensionality; Supremum of an empirical process; Convergence
DOI
10.1016/j.jspi.2024.106182
CLC classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline codes
020208; 070103; 0714
Abstract
In this paper, we consider robust nonparametric regression using deep neural networks with the ReLU activation function. While several existing, theoretically justified methods are geared towards robustness against identically distributed heavy-tailed noise, the rise of adversarial attacks has emphasized the importance of safeguarding estimation procedures against systematic contamination. We approach this statistical issue by shifting our focus towards estimating conditional distributions. To address it robustly, we introduce a novel estimation procedure based on ℓ-estimation. Under a mild model assumption, we establish general non-asymptotic risk bounds for the resulting estimators, showcasing their robustness against contamination, outliers, and model misspecification. We then delve into the application of our approach using deep ReLU neural networks. When the model is well-specified and the regression function belongs to an α-Hölder class, employing ℓ-type estimation on suitable networks enables the resulting estimators to achieve the minimax optimal rate of convergence. Additionally, we demonstrate that deep ℓ-type estimators can circumvent the curse of dimensionality by assuming the regression function closely resembles the composition of several Hölder functions. To attain this, new deep fully-connected ReLU neural networks have been designed to approximate this composition class. This approximation result may be of independent interest.
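The setting the abstract describes can be illustrated with a minimal sketch: fit a small fully-connected ReLU network to a regression sample contaminated by heavy-tailed noise. Note this is not the paper's ℓ-estimation procedure; as a stand-in robust criterion it uses the familiar Huber loss, whose bounded influence function limits the pull of outliers. All names, hyperparameters, and the simulated data below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: smooth regression function plus heavy-tailed (Student-t) noise
n = 200
x = rng.uniform(-1.0, 1.0, size=(n, 1))
y = np.sin(np.pi * x) + 0.1 * rng.standard_t(df=2, size=(n, 1))

# One-hidden-layer fully-connected ReLU network: f(x) = W2 @ relu(W1 x + b1) + b2
h = 32
W1 = rng.normal(0.0, 1.0, size=(1, h))
b1 = np.zeros(h)
W2 = rng.normal(0.0, 0.1, size=(h, 1))
b2 = np.zeros(1)

def forward(x):
    z = x @ W1 + b1              # pre-activations, shape (n, h)
    a = np.maximum(z, 0.0)       # ReLU
    return z, a, a @ W2 + b2     # prediction, shape (n, 1)

def huber_loss(r, delta=0.5):
    # Quadratic near zero, linear in the tails
    return np.mean(np.where(np.abs(r) <= delta,
                            0.5 * r ** 2,
                            delta * (np.abs(r) - 0.5 * delta)))

def huber_grad(r, delta=0.5):
    # Derivative of the Huber loss w.r.t. the residual r = pred - y;
    # bounded, so a single outlier has limited influence on the fit
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

_, _, pred = forward(x)
loss0 = huber_loss(pred - y)

lr = 0.05
for _ in range(500):
    z, a, pred = forward(x)
    g = huber_grad(pred - y) / n   # dLoss/dpred
    gW2 = a.T @ g
    gb2 = g.sum(axis=0)
    gz = (g @ W2.T) * (z > 0)      # backprop through ReLU
    gW1 = x.T @ gz
    gb1 = gz.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, _, pred = forward(x)
loss_final = huber_loss(pred - y)
print(f"Huber loss: {loss0:.4f} -> {loss_final:.4f}")
```

Replacing `huber_grad` with the raw residual recovers ordinary least-squares training, under which the heavy-tailed noise degrades the fit noticeably more.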
Pages: 25
Related papers
50 records in total
  • [1] NONPARAMETRIC REGRESSION USING DEEP NEURAL NETWORKS WITH RELU ACTIVATION FUNCTION
    Schmidt-Hieber, Johannes
    ANNALS OF STATISTICS, 2020, 48 (04): : 1875 - 1897
  • [2] DISCUSSION OF: "NONPARAMETRIC REGRESSION USING DEEP NEURAL NETWORKS WITH RELU ACTIVATION FUNCTION"
    Shamir, Ohad
    ANNALS OF STATISTICS, 2020, 48 (04): : 1911 - 1915
  • [3] Nonparametric regression with modified ReLU networks
    Beknazaryan, Aleksandr
    Sang, Hailin
    STATISTICS & PROBABILITY LETTERS, 2022, 190
  • [4] Nonparametric Regression Using Over-parameterized Shallow ReLU Neural Networks
    Yang, Yunfei
    Zhou, Ding-Xuan
    JOURNAL OF MACHINE LEARNING RESEARCH, 2024, 25 : 1 - 35
  • [5] NONPARAMETRIC REGRESSION USING DEEP NEURAL NETWORKS WITH RELU ACTIVATION FUNCTION (vol 48, pg 1875, 2020)
    Schmidt-Hieber, Johannes
    Vu, Don
    ANNALS OF STATISTICS, 2024, 52 (01): : 413 - 414
  • [6] Nonparametric Expectile Regression Meets Deep Neural Networks: A Robust Nonlinear Variable Selection method
    Yang, Rui
    Song, Yunquan
    STATISTICAL ANALYSIS AND DATA MINING, 2024, 17 (06)
  • [7] Factor Augmented Sparse Throughput Deep ReLU Neural Networks for High Dimensional Regression
    Fan, Jianqing
    Gu, Yihong
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2024, 119 (548) : 2680 - 2694
  • [8] Approximation in Lp(μ) with deep ReLU neural networks
    Voigtlaender, Felix
    Petersen, Philipp
    2019 13TH INTERNATIONAL CONFERENCE ON SAMPLING THEORY AND APPLICATIONS (SAMPTA), 2019,
  • [9] Nonparametric regression on low-dimensional manifolds using deep ReLU networks: function approximation and statistical recovery
    Chen, Minshuo
    Jiang, Haoming
    Liao, Wenjing
    Zhao, Tuo
    INFORMATION AND INFERENCE-A JOURNAL OF THE IMA, 2022, 11 (04) : 1203 - 1253
  • [10] Conformal prediction for robust deep nonparametric regression
    Kong, Jingsen
    Liu, Yiming
    Yang, Guangren
    Zhou, Wang
    STATISTICAL PAPERS, 2025, 66 (01)