HOW DO NOISE TAILS IMPACT ON DEEP RELU NETWORKS?

Cited by: 1
Authors
Fan, Jianqing [1]
Gu, Yihong [1]
Zhou, Wen-Xin [2]
Affiliations
[1] Princeton Univ, Dept Operat Res & Financial Engn, Princeton, NJ 08544 USA
[2] Univ Illinois, Dept Informat & Decis Sci, Chicago, IL USA
Keywords
Robustness; truncation; heavy tails; optimal rates; approximability of ReLU networks; GEOMETRIZING RATES; CONVERGENCE-RATES; NEURAL-NETWORKS; REGRESSION; APPROXIMATION; ROBUST; BOUNDS
DOI
10.1214/24-AOS2428
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
This paper investigates the stability of deep ReLU neural networks for nonparametric regression under the assumption that the noise has only a finite p-th moment. We unveil how the optimal rate of convergence depends on p, the degree of smoothness, and the intrinsic dimension in a class of nonparametric regression functions with hierarchical composition structure when both the adaptive Huber loss and deep ReLU neural networks are used. This optimal rate of convergence cannot be obtained by ordinary least squares, but can be achieved by the Huber loss with a properly chosen parameter that adapts to the sample size, smoothness, and moment parameters. A concentration inequality for the adaptive Huber ReLU neural network estimators with allowable optimization errors is also derived. To establish a matching lower bound within the class of neural network estimators using the Huber loss, we employ a strategy different from the traditional route: we construct a deep ReLU network estimator that attains a smaller empirical loss than the true function, and the difference between these two functions furnishes a lower bound. This step is related to the Huberization bias, yet more critically to the approximability of deep ReLU networks. As a result, we also contribute some new results on the approximation theory of deep ReLU neural networks.
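A minimal sketch of the adaptive Huber loss that the abstract refers to may help: the loss is quadratic for residuals below a truncation level tau and linear beyond it, and tau is allowed to diverge with the sample size so that the Huberization bias vanishes at the right rate. The Python below is illustrative only; the function names are invented here, and the exponent in adaptive_tau follows the standard mean-estimation heuristic, whereas the paper's optimal tau depends jointly on the sample size, the moment order p, and the smoothness of the regression class.

    import numpy as np

    def huber_loss(residuals, tau):
        # Quadratic for |r| <= tau, linear beyond: large residuals from
        # heavy-tailed noise are down-weighted instead of squared.
        r = np.abs(residuals)
        return np.where(r <= tau, 0.5 * r**2, tau * r - 0.5 * tau**2)

    def adaptive_tau(n, p, c=1.0):
        # Illustrative truncation level (an assumption, not the paper's
        # exact formula): it grows with the sample size n, with an
        # exponent driven by the finite p-th moment of the noise.
        return c * (n / np.log(n)) ** (1.0 / p)

    # A ReLU network f_theta would then be trained by minimizing the
    # empirical adaptive Huber risk rather than the least-squares risk:
    #   theta_hat in argmin_theta  mean(huber_loss(y - f_theta(x), tau))

With p = 2 (finite variance) this heuristic reduces to the familiar square root of n / log(n) truncation from adaptive Huber regression; the abstract's point is that choosing tau adaptively in n, p, and the smoothness lets the estimator attain the optimal rate that ordinary least squares cannot.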
Pages: 1845-1871
Page count: 27
Related Papers
50 records in total
  • [41] Filtering Out High Noise Data for Distributed Deep Neural Networks
    Cui, Yangguang
    Li, Liying
    Tao, Zhe
    Chen, Mingsong
    Wei, Tongquan
    IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING, 2023, 20 (01) : 101 - 111
  • [42] On the expected behaviour of noise regularised deep neural networks as Gaussian processes
    Pretorius, Arnu
    Kamper, Herman
    Kroon, Steve
    PATTERN RECOGNITION LETTERS, 2020, 138 : 75 - 81
  • [43] Density regression and uncertainty quantification with Bayesian deep noise neural networks
    Zhang, Daiwei
    Liu, Tianci
    Kang, Jian
    STAT, 2023, 12 (01)
  • [44] Disrupting Audio Event Detection Deep Neural Networks with White Noise
    dos Santos, Rodrigo
    Kassetty, Ashwitha
    Nilizadeh, Shirin
    TECHNOLOGIES, 2021, 9 (03)
  • [45] Artificial Intelligence, Machine Learning, Deep Learning, and Cognitive Computing: What Do These Terms Mean and How Will They Impact Health Care?
    Bini, Stefano A.
    JOURNAL OF ARTHROPLASTY, 2018, 33 (08) : 2358 - 2361
  • [46] UNDERSTANDING HOW DEEP BELIEF NETWORKS PERFORM ACOUSTIC MODELLING
    Mohamed, Abdel-rahman
    Hinton, Geoffrey
    Penn, Gerald
    2012 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2012: 4273 - 4276
  • [47] Mitigating the impact of adversarial attacks in very deep networks
    Hassanin, Mohammed
    Radwan, Ibrahim
    Moustafa, Nour
    Tahtali, Murat
    Kumar, Neeraj
    APPLIED SOFT COMPUTING, 2021, 105
  • [48] How do tasks impact the reliability of fMRI functional connectivity?
    Rai, Shefali
    Graff, Kirk
    Tansey, Ryann
    Bray, Signe
    HUMAN BRAIN MAPPING, 2024, 45 (03)
  • [49] Deep convolutional networks do not classify based on global object shape
    Baker, Nicholas
    Lu, Hongjing
    Erlikhman, Gennady
    Kellman, Philip J.
    PLOS COMPUTATIONAL BIOLOGY, 2018, 14 (12)
  • [50] On Two-Layer Hierarchical Networks: How Does the Brain Do This?
    Beiu, Valeriu
    Madappuram, Basheer A. M.
    Kelly, Peter M.
    McDaid, Liam J.
    NANO-NET, 2009, 20 : 231+