MEAN FIELD ANALYSIS OF NEURAL NETWORKS: A LAW OF LARGE NUMBERS

Cited by: 94
Authors
Sirignano, Justin [1 ]
Spiliopoulos, Konstantinos [2 ]
Affiliations
[1] Univ Illinois, Dept Ind & Syst Engn, Champaign, IL 61820 USA
[2] Boston Univ, Dept Math & Stat, Boston, MA 02215 USA
Funding
National Science Foundation (USA);
Keywords
stochastic analysis; weak convergence; machine learning; APPROXIMATION;
DOI
10.1137/18M1192184
CLC Classification
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
Machine learning, and in particular neural network models, have revolutionized fields such as image, text, and speech recognition. Today, many important real-world applications in these areas are driven by neural networks. There are also growing applications in engineering, robotics, medicine, and finance. Despite their immense success in practice, there is limited mathematical understanding of neural networks. This paper illustrates how neural networks can be studied via stochastic analysis and develops approaches for addressing some of the technical challenges which arise. We analyze one-layer neural networks in the asymptotic regime of simultaneously (a) large network sizes and (b) large numbers of stochastic gradient descent training iterations. We rigorously prove that the empirical distribution of the neural network parameters converges to the solution of a nonlinear partial differential equation. This result can be considered a law of large numbers for neural networks. In addition, a consequence of our analysis is that the trained parameters of the neural network asymptotically become independent, a property which is commonly called "propagation of chaos."
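The scaling described in the abstract can be sketched as follows; the notation here is illustrative and may differ from the paper's precise formulation, which should be consulted for the exact assumptions and the form of the limit equation. A one-layer network of width $N$ is written with a $1/N$ normalization, and the object of study is the empirical measure of its parameters along the SGD trajectory:

```latex
% Schematic mean-field setup (illustrative notation, not the paper's exact one).
\[
  g^{N}_{\theta}(x) \;=\; \frac{1}{N}\sum_{i=1}^{N} c^{i}\,\sigma\!\left(w^{i}\cdot x\right),
  \qquad
  \mu^{N}_{t} \;=\; \frac{1}{N}\sum_{i=1}^{N} \delta_{\left(c^{i}_{k},\,w^{i}_{k}\right)},
  \quad k = \lfloor Nt \rfloor ,
\]
% where (c^i_k, w^i_k) are the parameters after k SGD steps. As N -> infinity,
% with width and iteration count scaled together, mu^N converges weakly to a
% deterministic limit \bar{\mu} solving a nonlinear PDE of McKean--Vlasov type,
% and any fixed finite collection of parameters becomes asymptotically
% independent ("propagation of chaos").
```

This is only a schematic: the paper specifies the learning-rate scaling, the assumptions on $\sigma$ and the data, and the precise nonlinear PDE satisfied by the limit.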
Pages: 725-752
Page count: 28
Related Papers
50 items in total
  • [1] Mean Field Analysis of Deep Neural Networks
    Sirignano, Justin
    Spiliopoulos, Konstantinos
    MATHEMATICS OF OPERATIONS RESEARCH, 2022, 47 (01) : 120 - 152
  • [2] Mean-field inference methods for neural networks
    Gabrie, Marylou
    JOURNAL OF PHYSICS A-MATHEMATICAL AND THEORETICAL, 2020, 53 (22)
  • [3] Learning phase field mean curvature flows with neural networks
    Bretin, Elie
    Denis, Roland
    Masnou, Simon
    Terii, Garry
    JOURNAL OF COMPUTATIONAL PHYSICS, 2022, 470
  • [4] On laws of large numbers for systems with mean-field interactions and Markovian switching
    Nguyen, Son L.
    Yin, George
    Hoang, Tuan A.
    STOCHASTIC PROCESSES AND THEIR APPLICATIONS, 2020, 130 (01) : 262 - 296
  • [5] Mean-field Langevin dynamics and energy landscape of neural networks
    Hu, Kaitong
    Ren, Zhenjie
    Siska, David
    Szpruch, Lukasz
    ANNALES DE L INSTITUT HENRI POINCARE-PROBABILITES ET STATISTIQUES, 2021, 57 (04) : 2043 - 2065
  • [6] Mean-field theory of graph neural networks in graph partitioning
    Kawamoto, Tatsuro
    Tsubaki, Masashi
    Obuchi, Tomoyuki
    JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT, 2019, 2019 (12)
  • [7] Law of Large Numbers and Central Limit Theorem for Wide Two-layer Neural Networks: The Mini-Batch and Noisy Case
    Descours, Arnaud
    Guillin, Arnaud
    Michel, Manon
    Nectoux, Boris
    JOURNAL OF MACHINE LEARNING RESEARCH, 2024, 25
  • [8] Dynamics of finite width Kernel and prediction fluctuations in mean field neural networks
    Bordelon, Blake
    Pehlevan, Cengiz
    JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT, 2024, 2024 (10)
  • [9] From Optimal Control to Mean Field Optimal Transport via Stochastic Neural Networks
    Di Persio, Luca
    Garbelli, Matteo
    SYMMETRY-BASEL, 2023, 15 (09)
  • [10] A WEAK LAW OF LARGE NUMBERS FOR DEPENDENT RANDOM VARIABLES
    Karatzas, I.
    Schachermayer, W.
    THEORY OF PROBABILITY AND ITS APPLICATIONS, 2023, 68 (03) : 501 - 509