HOW DO NOISE TAILS IMPACT ON DEEP RELU NETWORKS?

Cited by: 1
Authors
Fan, Jianqing [1]
Gu, Yihong [1 ]
Zhou, Wen-Xin [2 ]
Affiliations
[1] Princeton Univ, Dept Operat Res & Financial Engn, Princeton, NJ 08544 USA
[2] Univ Illinois, Dept Informat & Decis Sci, Chicago, IL USA
Keywords
Robustness; truncation; heavy tails; optimal rates; approximability of ReLU networks; GEOMETRIZING RATES; CONVERGENCE-RATES; NEURAL-NETWORKS; REGRESSION; APPROXIMATION; ROBUST; BOUNDS;
DOI
10.1214/24-AOS2428
CLC number
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject classification codes
020208; 070103; 0714;
Abstract
This paper investigates the stability of deep ReLU neural networks for nonparametric regression under the assumption that the noise has only a finite p-th moment. We unveil how the optimal rate of convergence depends on p, the degree of smoothness, and the intrinsic dimension in a class of nonparametric regression functions with hierarchical composition structure when both the adaptive Huber loss and deep ReLU neural networks are used. This optimal rate of convergence cannot be obtained by ordinary least squares, but it can be achieved by the Huber loss with a properly chosen parameter that adapts to the sample size, smoothness, and moment parameters. A concentration inequality for the adaptive Huber ReLU neural network estimators with allowable optimization errors is also derived. To establish a matching lower bound within the class of neural network estimators using the Huber loss, we employ a strategy different from the traditional route: we construct a deep ReLU network estimator that has a smaller empirical loss than the true function, and the difference between these two functions furnishes a lower bound. This step is related to the Huberization bias, but more critically to the approximability of deep ReLU networks. As a result, we also contribute new results on the approximation theory of deep ReLU neural networks.
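The Python sketch below illustrates, under stated assumptions, the kind of estimator the abstract describes: a deep ReLU network fitted by minimizing a Huber loss whose robustification parameter tau grows with the sample size n. It is not the authors' code; the network architecture, the optimizer, the Student-t noise, and the scaling tau = 2.0 * n ** 0.4 are illustrative assumptions, while the paper derives the theoretically optimal choice of tau from n, the smoothness, and the moment order p.

import torch
import torch.nn as nn

def fit_huber_relu(X, y, tau, width=64, depth=3, epochs=200, lr=1e-3):
    # Build a ReLU multilayer perceptron of the given width and depth.
    layers, d_in = [], X.shape[1]
    for _ in range(depth):
        layers += [nn.Linear(d_in, width), nn.ReLU()]
        d_in = width
    layers.append(nn.Linear(d_in, 1))
    net = nn.Sequential(*layers)

    # Huber loss: quadratic for residuals of magnitude <= tau, linear beyond tau.
    loss_fn = nn.HuberLoss(delta=tau)
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(net(X).squeeze(-1), y)
        loss.backward()
        opt.step()
    return net

if __name__ == "__main__":
    torch.manual_seed(0)
    n = 2000
    X = torch.rand(n, 3)
    # Student-t noise with 2.5 degrees of freedom has a finite p-th moment only for p < 2.5.
    noise = torch.distributions.StudentT(df=2.5).sample((n,))
    y = torch.sin(4 * X[:, 0]) * X[:, 1] + noise
    tau = 2.0 * n ** 0.4  # hypothetical adaptive choice; the paper gives the theoretical scaling
    net = fit_huber_relu(X, y, tau)

Ordinary least squares corresponds to the limit tau -> infinity; the abstract's point is that this choice does not attain the optimal rate when the noise is heavy-tailed, whereas a suitably adaptive tau does.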
Pages: 1845-1871
Page count: 27