Large deviations and mean-field theory for asymmetric random recurrent neural networks

Cited: 33
Authors
Moynot, O
Samuelides, M
Affiliations
[1] Univ Toulouse 3, Lab Stat Probabil, F-31062 Toulouse, France
[2] Off Natl Etud & Rech Aerosp, F-31055 Toulouse, France
DOI
10.1007/s004400100182
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Codes
020208; 070103; 0714;
Abstract
In this article, we study the asymptotic dynamics of a noisy discrete-time neural network with random asymmetric couplings and thresholds. More precisely, we focus on the limit behaviour of the network as its size grows to infinity while time remains bounded. In the case of Gaussian connection weights, we use the same techniques as Ben Arous and Guionnet (see [3]) to prove that the image law of the distribution of the neurons' activation states under the empirical measure satisfies a temperature-free large deviation principle. Moreover, we prove that if the connection weights satisfy a general condition of domination by Gaussian tails, then the distribution of the activation potential of each neuron converges weakly towards an explicit Gaussian law, whose characteristics are given by the mean-field equations stated by Cessac-Doyon-Quoy-Samuelides (see [4-6]). Furthermore, under this hypothesis, we obtain a law of large numbers and a propagation-of-chaos result. Finally, we show that many classical distributions on the couplings fulfil our general condition. Thus, this paper provides rigorous mean-field results for a large class of neural networks currently investigated in the neural network literature.
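The mean-field prediction summarized in the abstract can be checked numerically: for a large asymmetric network with Gaussian couplings of variance proportional to 1/N, the activation potential of each neuron should look approximately Gaussian, with variance determined self-consistently by the population statistics. The sketch below is an illustration only, not code from the paper; the network size, transfer function (tanh), noise level, and threshold distribution are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 2000          # network size (mean-field regime: N large, time bounded)
T = 10            # bounded number of discrete time steps
sigma_J = 1.0     # couplings J_ij ~ N(0, sigma_J**2 / N), drawn independently (asymmetric)
sigma_b = 0.3     # additive dynamical noise level
theta = rng.normal(0.0, 0.5, size=N)   # random thresholds

J = rng.normal(0.0, sigma_J / np.sqrt(N), size=(N, N))
f = np.tanh       # sigmoidal transfer function (illustrative choice)

x = rng.normal(size=N)
for _ in range(T):
    r = f(x)                              # neuron activation states
    u = J @ r + theta                     # activation potential of each neuron
    mf_var = sigma_J**2 * np.mean(r**2)   # mean-field prediction for Var(u - theta)
    x = u + sigma_b * rng.normal(size=N)  # noisy discrete-time update

# Across neurons, u - theta should be close to a centred Gaussian
# whose variance matches the self-consistent mean-field value.
pot = u - theta
print("empirical mean:", pot.mean())
print("empirical var :", pot.var())
print("mean-field var:", mf_var)
```

At finite N the match is only approximate (fluctuations of order 1/sqrt(N), plus residual correlations between the couplings and the trajectory), but the asymmetry of J keeps those correlations weak over a bounded time horizon, which is the regime the paper's propagation-of-chaos result addresses.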
Pages: 41-75 (35 pages)
Related Papers (50 in total)
  • [1] Large deviations and mean-field theory for asymmetric random recurrent neural networks
    Moynot, O.; Samuelides, M.
    Probability Theory and Related Fields, 2002, 123: 41-75
  • [2] Mean-field theory and synchronization in random recurrent neural networks
    Dauce, E.; Moynot, O.; Pinaud, O.; Samuelides, M.
    Neural Processing Letters, 2001, 14 (02): 115-126
  • [3] Mean-field theory of fluid neural networks
    Delgado, J.; Sole, R. V.
    Physical Review E, 1998, 57 (02): 2204-2211
  • [4] Asymmetric mean-field neural networks for multiprocessor scheduling
    Hellstrom, B. J.; Kanal, L. N.
    Neural Networks, 1992, 5 (04): 671-686
  • [5] Dynamic mean-field theory for continuous random networks
    Zuniga-Galindo, W. A.
    Journal of Physics A: Mathematical and Theoretical, 2025, 58 (12)
  • [6] Mean field theory for asymmetric neural networks
    Kappen, H. J.; Spanjers, J. J.
    Physical Review E, 2000, 61 (05): 5658-5663
  • [7] Coherence resonance in random Erdos-Renyi neural networks: mean-field theory
    Hutt, A.; Wahl, T.; Voges, N.; Hausmann, Jo; Lefebvre, J.
    Frontiers in Applied Mathematics and Statistics, 2021, 7
  • [8] Beyond dynamical mean-field theory of neural networks
    Muratori, M.; Cessac, B.
    BMC Neuroscience, 14 (Suppl 1)