MEAN FIELD ANALYSIS OF NEURAL NETWORKS: A LAW OF LARGE NUMBERS

Cited by: 94
Authors
Sirignano, Justin [1 ]
Spiliopoulos, Konstantinos [2 ]
Affiliations
[1] Univ Illinois, Dept Ind & Syst Engn, Champaign, IL 61820 USA
[2] Boston Univ, Dept Math & Stat, Boston, MA 02215 USA
Funding
National Science Foundation (USA)
Keywords
stochastic analysis; weak convergence; machine learning; approximation
DOI
10.1137/18M1192184
CLC number
O29 [Applied Mathematics]
Subject classification code
070104
Abstract
Machine learning, and in particular neural network models, have revolutionized fields such as image, text, and speech recognition. Today, many important real-world applications in these areas are driven by neural networks. There are also growing applications in engineering, robotics, medicine, and finance. Despite their immense success in practice, there is limited mathematical understanding of neural networks. This paper illustrates how neural networks can be studied via stochastic analysis and develops approaches for addressing some of the technical challenges which arise. We analyze one-layer neural networks in the asymptotic regime of simultaneously (a) large network sizes and (b) large numbers of stochastic gradient descent training iterations. We rigorously prove that the empirical distribution of the neural network parameters converges to the solution of a nonlinear partial differential equation. This result can be considered a law of large numbers for neural networks. In addition, a consequence of our analysis is that the trained parameters of the neural network asymptotically become independent, a property which is commonly called "propagation of chaos."
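The abstract's setting, a one-layer network trained by SGD in the joint limit of large width and many training steps, can be illustrated with a minimal numerical sketch. Everything below (the tanh activation, the sin target, the step count and learning rate) is an illustrative assumption rather than the paper's own experiment; only the 1/N output normalization and the order-N learning-rate scaling follow the mean-field setup the abstract describes.

```python
import numpy as np

def mse(w, c, xs, ys):
    """MSE of the width-len(w) mean-field network on a grid of inputs."""
    preds = np.mean(c[:, None] * np.tanh(np.outer(w, xs)), axis=0)
    return float(np.mean((preds - ys) ** 2))

def train_one_layer(N, steps=5000, lr=0.1, seed=0):
    """Run SGD on f(x) = (1/N) * sum_i c_i * tanh(w_i * x).

    The 1/N factor in the gradient is absorbed into a learning rate of
    order N, so each "particle" (w_i, c_i) moves O(1) per unit time --
    the scaling under which the mean-field limit is taken.
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(size=N)  # i.i.d. initialization: empirical measure mu_0
    c = rng.normal(size=N)
    for _ in range(steps):
        x = rng.uniform(-1.0, 1.0)
        y = np.sin(np.pi * x)          # illustrative 1-d regression target
        a = np.tanh(w * x)
        err = np.mean(c * a) - y       # network output minus target
        grad_c = err * a               # per-particle gradients
        grad_w = err * c * (1.0 - a * a) * x
        c -= lr * grad_c
        w -= lr * grad_w
    return w, c
```

Under this scaling, rerunning with a larger N and a different seed produces a nearly identical trained function: the empirical distribution of the particles (w_i, c_i) concentrates on a deterministic limit, which is the law-of-large-numbers statement the paper proves.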
Pages: 725-752
Number of pages: 28
Related papers
50 records in total
  • [21] Law of Large Numbers for Random LU-Fuzzy Numbers: Some Results in the Context of Simulation of Financial Quantities
    Holcapek, Michal
    Tichy, Tomas
    PROCEEDINGS OF THE 14TH INTERNATIONAL CONFERENCE ON FINANCE AND BANKING, 2014, : 124 - 130
  • [22] A spatial stochastic epidemic model: law of large numbers and central limit theorem
    Bowong, S.
    Emakoua, A.
    Pardoux, E.
    STOCHASTICS AND PARTIAL DIFFERENTIAL EQUATIONS-ANALYSIS AND COMPUTATIONS, 2023, 11 (01): : 31 - 105
  • [23] On the Performance of Convolutional Neural Networks for Side-Channel Analysis
    Picek, Stjepan
    Samiotis, Ioannis Petros
    Kim, Jaehun
    Heuser, Annelie
    Bhasin, Shivam
    Legay, Axel
    SECURITY, PRIVACY, AND APPLIED CRYPTOGRAPHY ENGINEERING, SPACE 2018, 2018, 11348 : 157 - 176
  • [24] EQUILIBRIUM LARGE DEVIATIONS FOR MEAN-FIELD SYSTEMS WITH TRANSLATION INVARIANCE
    Reygner, Julien
    ANNALS OF APPLIED PROBABILITY, 2018, 28 (05) : 2922 - 2965
  • [25] A learning method for vector field approximation by neural networks
    Kuroe, Y
    Mitsui, M
    Kawakami, H
    Mori, T
    IEEE WORLD CONGRESS ON COMPUTATIONAL INTELLIGENCE, 1998, : 2300 - 2305
  • [26] A new data assimilation method of recovering turbulent mean flow field at high Reynolds numbers
    Liu, Yilang
    Zhang, Weiwei
    Xia, Zhenhua
    AEROSPACE SCIENCE AND TECHNOLOGY, 2022, 126
  • [27] Approximation Analysis of Convolutional Neural Networks
    Bao, Chenglong
    Li, Qianxiao
    Shen, Zuowei
    Tai, Cheng
    Wu, Lei
    Xiang, Xueshuang
    EAST ASIAN JOURNAL ON APPLIED MATHEMATICS, 2023, 13 (03) : 524 - 549
  • [28] NeuralSens: Sensitivity Analysis of Neural Networks
    Pizarroso, Jaime
    Portela, Jose
    Munoz, Antonio
    JOURNAL OF STATISTICAL SOFTWARE, 2022, 102 (07): : 1 - 36
  • [29] Neural Networks for Driver Behavior Analysis
    Martinelli, Fabio
    Marulli, Fiammetta
    Mercaldo, Francesco
    Santone, Antonella
    ELECTRONICS, 2021, 10 (03) : 1 - 23
  • [30] Age Analysis with Convolutional Neural Networks
    Perez-Delgado, Maria-Luisa
    Roman-Gallego, Jesus-Angel
    NEW TRENDS IN DISRUPTIVE TECHNOLOGIES, TECH ETHICS AND ARTIFICIAL INTELLIGENCE, DITTET 2023, 2023, 1452 : 28 - 37