SELF-CONSISTENT SIGNAL-TO-NOISE ANALYSIS OF THE STATISTICAL BEHAVIOR OF ANALOG NEURAL NETWORKS AND ENHANCEMENT OF THE STORAGE CAPACITY

Cited by: 85
Authors
SHIINO, M [1 ]
FUKAI, T [1 ]
Affiliation
[1] TOKAI UNIV, DEPT ELECTR, HIRATSUKA, KANAGAWA 25912, JAPAN
Source
PHYSICAL REVIEW E | 1993, Vol. 48, No. 02
DOI
10.1103/PhysRevE.48.867
Chinese Library Classification
O35 [Fluid Mechanics]; O53 [Plasma Physics];
Subject Classification Codes
070204 ; 080103 ; 080704 ;
Abstract
Based on the self-consistent signal-to-noise analysis (SCSNA), which can deal with analog neural networks having a wide class of transfer functions, enhancement of the storage capacity of associative memory and the related statistical properties of neural networks are studied for random memory patterns. Two types of transfer functions with a threshold parameter θ are considered; both are derived from the sigmoidal one to represent the output of three-state neurons. Neural networks with the monotonically increasing transfer function F_M, F_M(u) = sgn(u) for |u| > θ and F_M(u) = 0 for |u| ≤ θ, are shown to make it impossible for the spin-glass state to coexist with retrieval states in a certain parameter region of θ and α (the loading rate of memory patterns), implying a reduction of the number of spurious states. The behavior of the storage capacity as θ is varied is qualitatively the same as that of Ising-spin neural networks with varying temperature. The nonmonotonic transfer function F_NM, F_NM(u) = sgn(u) for |u| < θ and F_NM(u) = 0 for |u| ≥ θ, on the other hand, gives rise to remarkable features in several respects. First, it yields a large enhancement of the storage capacity over the Amit-Gutfreund-Sompolinsky (AGS) value: as θ is decreased from θ = ∞, the storage capacity α_c of such a network increases from the AGS value (≈ 0.14), attains its maximum of ≈ 0.42 at θ ≅ 0.7, and then decreases to vanish at θ = 0.
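As an illustration only (not part of the record), the two three-state transfer functions defined above can be sketched in Python; the names `F_M` and `F_NM` are shorthand for the monotonic and nonmonotonic functions of the abstract:

```python
import numpy as np

def F_M(u, theta):
    """Monotonic three-state transfer function: sgn(u) outside the
    dead zone, i.e. sgn(u) for |u| > theta and 0 for |u| <= theta."""
    u = np.asarray(u, dtype=float)
    return np.where(np.abs(u) > theta, np.sign(u), 0.0)

def F_NM(u, theta):
    """Nonmonotonic counterpart: sgn(u) for |u| < theta and 0 for
    |u| >= theta, so the output is cut off for large local fields."""
    u = np.asarray(u, dtype=float)
    return np.where(np.abs(u) < theta, np.sign(u), 0.0)
```

Note that F_NM is simply F_M with the active and quiescent regions of the local field interchanged, which is what makes it nonmonotonic.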
Whereas for θ ≳ 1 the storage capacity α_c coincides with the value ᾱ_c determined by the SCSNA as the upper bound of α ensuring the existence of retrieval solutions, for θ ≲ 1 the α_c is shown to differ from ᾱ_c, with the result that the retrieval solutions claimed by the SCSNA are unstable for α_c < α < ᾱ_c. Second, in the case θ < 1 the network can exhibit a new type of phase, which appears as the result of a phase transition with respect to the non-Gaussian distribution of the local fields of neurons: the standard type of retrieval state with r ≠ 0 (i.e., finite width of the local-field distribution), which is implied by the order-parameter equations of the SCSNA, disappears at a certain critical loading rate α_0, and for α ≤ α_0 a qualitatively different type of retrieval state comes into existence in which the width of the local-field distribution vanishes (i.e., r = 0+). As a consequence, memory retrieval without errors becomes possible even in the saturation limit α ≠ 0. Results of computer simulations of the statistical properties of the novel phase with α ≤ α_0 are shown to be in satisfactory agreement with the theoretical results. The effect of introducing self-couplings on the storage capacity is also analyzed for the two types of networks. It is conspicuous for networks with F_NM, where the self-couplings increase the stability of the retrieval solutions of the SCSNA with small values of θ, leading to a remarkable enhancement of the storage capacity.
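A minimal retrieval simulation of the kind the abstract alludes to can be sketched as follows. This is an illustrative assumption, not the authors' code: Hebbian couplings with self-couplings removed, synchronous dynamics, and the monotonic transfer function at low loading (α = p/N = 0.02), where retrieval is expected to be stable:

```python
import numpy as np

rng = np.random.default_rng(0)
N, p, theta = 500, 10, 0.7          # loading rate alpha = p/N = 0.02

# Random +-1 memory patterns and Hebb couplings; self-couplings J_ii = 0
xi = rng.choice([-1.0, 1.0], size=(p, N))
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

def F_M(u, theta):
    # monotonic three-state transfer function from the abstract
    return np.where(np.abs(u) > theta, np.sign(u), 0.0)

# Synchronous retrieval dynamics started from stored pattern 0
s = xi[0].copy()
for _ in range(20):
    s = F_M(J @ s, theta)

# Overlap (order parameter m) between the final state and the memory
overlap = float(s @ xi[0]) / N
```

At this small loading rate the crosstalk noise is weak, so the overlap stays close to 1; probing the capacity enhancement reported for F_NM would require scanning α and θ and averaging over pattern realizations.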
Pages: 867-897
Page count: 31
References
41 records in total
[1] Amari, S. Characteristics of sparsely encoded associative memory. Neural Networks, 1989, 2(6): 451-457.
[2] Amari, S.; Maginu, K. Statistical neurodynamics of associative memory. Neural Networks, 1988, 1(1): 63-73.
[3] Amit, D. J. Modelling Brain Function. 1989.
[4] Amit, D. J.; Gutfreund, H.; Sompolinsky, H. Storing infinite numbers of patterns in a spin-glass model of neural networks. Physical Review Letters, 1985, 55(14): 1530-1533.
[5] Amit, D. J.; Gutfreund, H.; Sompolinsky, H. Statistical mechanics of neural networks near saturation. Annals of Physics, 1987, 173(1): 30-67.
[6] Coolen, A. C. C.; Ruijgrok, T. W. Image evolution in Hopfield networks. Physical Review A, 1988, 38(8): 4253-4255.
[7] Domany, E.; Kinzel, W.; Meir, R. Layered neural networks. Journal of Physics A: Mathematical and General, 1989, 22(12): 2081-2102.
[8] Fukai, T.; Shiino, M. Asymmetric neural networks incorporating the Dale hypothesis and noise-driven chaos. Physical Review Letters, 1990, 64(12): 1465-1468.
[9] Fukai, T.; Shiino, M. Comparative study of spurious-state distribution in analog neural networks and the Boltzmann machine. Journal of Physics A: Mathematical and General, 1992, 25(10): 2873-2887.
[10] Fukai, T.; Shiino, M. Large suppression of spurious states in neural networks of nonlinear analog neurons. Physical Review A, 1990, 42(12): 7459-7466.