Convergence analysis of deep residual networks

Cited: 2
Authors
Huang, Wentao [1 ]
Zhang, Haizhang [1 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Math Zhuhai, Zhuhai 519082, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Deep learning; deep residual networks; ReLU networks; convolutional neural networks; convergence; RELU NETWORKS; ERROR-BOUNDS; APPROXIMATION; WIDTH;
DOI
10.1142/S021953052350029X
Chinese Library Classification
O29 [Applied Mathematics]
Discipline Code
070104
Abstract
Various powerful deep neural network architectures have contributed to the remarkable successes of deep learning over the past two decades. Among them, deep Residual Networks (ResNets) are of particular importance: they proved highly effective in computer vision, winning first place in many deep learning competitions, and they were the first class of neural networks in the history of deep learning that were truly deep. It is therefore of both mathematical interest and practical value to understand the convergence of deep ResNets. We study the convergence of deep ResNets, as the depth tends to infinity, in terms of the parameters of the networks. Toward this purpose, we first give a matrix-vector description of general deep neural networks with shortcut connections and derive an explicit expression for such networks using the notion of activation matrices. Convergence is then reduced to the convergence of two series involving infinite products of non-square matrices. By studying these two series, we establish a sufficient condition for the pointwise convergence of ResNets. We also conduct experiments on benchmark machine learning data to illustrate the potential usefulness of the results.
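The two ideas the abstract relies on can be illustrated concretely: a ReLU acts on a vector as multiplication by a diagonal 0/1 "activation matrix", and a ResNet is a recursion of identity-shortcut updates whose depth limit exists when the weight norms decay fast enough. The sketch below is an illustrative simplification, not the paper's exact formulation; the block structure (`x + W @ relu(x)`) and the geometric norm decay are assumptions chosen for the demonstration.

```python
import numpy as np

def activation_matrix(x):
    # Diagonal 0/1 matrix D(x) with ReLU(x) = D(x) @ x,
    # the "activation matrix" device used to linearize each layer.
    return np.diag((x > 0).astype(float))

def relu(x):
    return np.maximum(x, 0.0)

def resnet_forward(x, weights):
    # Identity shortcut plus a weighted ReLU branch at each layer:
    #   x_{k+1} = x_k + W_k @ ReLU(x_k)
    for W in weights:
        x = x + W @ relu(x)
    return x

# With summable weight norms (here ||W_k|| = 2^{-(k+1)}), deepening
# the network changes the output less and less, mimicking the kind of
# depth-limit convergence the paper analyzes.
weights = [0.5 ** (k + 1) * np.eye(3) for k in range(30)]
x0 = np.array([1.0, -2.0, 3.0])
shallow = resnet_forward(x0, weights[:10])
deep = resnet_forward(x0, weights)
```

Here the outputs of the 10-layer and 30-layer networks agree to many digits, because the tail weights contribute a convergent series; with non-decaying weights the recursion would generally diverge with depth.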
Pages: 351-382
Page count: 32