Optimizing One-Shot Learning with Binary Synapses

Cited by: 16
Authors
Romani, Sandro [1]
Amit, Daniel J. [3,4]
Amit, Yali [2]
Affiliations
[1] Univ Roma La Sapienza, I-00185 Rome, Italy
[2] Univ Chicago, Dept Stat & Comp Sci, Chicago, IL 60637 USA
[3] Hebrew Univ Jerusalem, Racah Inst Phys, IL-91904 Jerusalem, Israel
[4] Univ Rome, INFM, I-00185 Rome, Italy
Keywords
DOI
10.1162/neco.2008.10-07-618
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
A network of excitatory synapses trained with a conservative version of Hebbian learning is used as a model for recognizing thousands of once-seen stimuli as familiar and distinguishing them from stimuli never seen before. Such networks were initially proposed for modeling memory retrieval (selective delay activity). We show that the same framework allows the incorporation of both familiarity recognition and memory retrieval, and we estimate the network's capacity. In the case of binary neurons, we extend the analysis of Amit and Fusi (1994) to obtain capacity limits based on computations of the signal-to-noise ratio of the field difference between selective and non-selective neurons of learned signals. We show that with fast learning (potentiation probability approximately 1), the most recently learned patterns can be retrieved in working memory (selective delay activity). A much higher number of once-seen learned patterns elicit a realistic familiarity signal in the presence of an external field. With a potentiation probability much less than 1 (slow learning), memory retrieval disappears, whereas familiarity recognition capacity is maintained at a similarly high level. This analysis is corroborated in simulations. For analog neurons, where such analysis is more difficult, we simplify the capacity analysis by studying the excess number of potentiated synapses above the steady-state distribution. In this framework, we derive the optimal constraint between potentiation and depression probabilities that maximizes the capacity.
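The kind of one-shot stochastic learning with binary synapses and the familiarity readout described in the abstract can be illustrated with a minimal NumPy sketch. This is only a toy under stated assumptions: the parameter values, the initial fraction of potentiated synapses, and the depression rule (depressing when exactly one of the two neurons is active) are illustrative choices and not the model analyzed in the paper; `present` and `mean_field` are hypothetical helper names.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy parameters (not the values analyzed in the paper).
N = 1000      # number of neurons
f = 0.05      # coding level: fraction of neurons active per stimulus
P = 200       # number of once-seen stimuli
q_pot = 1.0   # potentiation probability ("fast learning"); try a small value for slow learning
q_dep = 0.05  # depression probability (an arbitrary illustrative choice)

# Binary synaptic matrix, initialized at an assumed steady-state fraction of
# potentiated synapses under the update rule used below.
g0 = (f**2 * q_pot) / (f**2 * q_pot + 2 * f * (1 - f) * q_dep)
J = (rng.random((N, N)) < g0).astype(np.int8)

stimuli = (rng.random((P, N)) < f).astype(np.int8)

def present(J, xi):
    """One-shot stochastic Hebbian update of binary synapses for one stimulus."""
    active = xi.astype(bool)
    both_active = np.outer(active, active)           # pre and post active: candidate potentiation
    mismatch = np.logical_xor.outer(active, active)  # exactly one active: candidate depression (an assumption)
    J[both_active & (rng.random(J.shape) < q_pot)] = 1
    J[mismatch & (rng.random(J.shape) < q_dep)] = 0

for xi in stimuli:
    present(J, xi)

def mean_field(J, xi):
    """Mean synaptic input to the active neurons of a test stimulus."""
    active = xi.astype(bool)
    return J[np.ix_(active, active)].mean()

novel = (rng.random((P, N)) < f).astype(np.int8)
h_seen = np.array([mean_field(J, xi) for xi in stimuli])
h_new = np.array([mean_field(J, xi) for xi in novel])

# Familiarity signal: separation of the field distributions for seen vs. novel stimuli.
snr = (h_seen.mean() - h_new.mean()) / np.sqrt(0.5 * (h_seen.var() + h_new.var()))
print(f"seen field {h_seen.mean():.3f}  novel field {h_new.mean():.3f}  SNR {snr:.2f}")
```

Qualitatively, with a potentiation probability near 1 the fields evoked by seen and novel stimuli separate cleanly for recent stimuli, while lowering it slows the erosion of older traces, which is the fast- versus slow-learning contrast the abstract describes.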
Pages: 1928-1950
Page count: 23
References
33 in total
[11] Brunel N. Storage capacity of neural networks: effect of the fluctuations of the number of active neurons per memory. Journal of Physics A: Mathematical and General, 1994, 27(14): 4783-4789.
[12] Brunel N, Carusi F, Fusi S. Slow stochastic Hebbian learning of classes of stimuli in a recurrent neural network. Network: Computation in Neural Systems, 1998, 9(1): 123-152.
[13] Del Giudice P, Fusi S, Mattia M. Modelling the formation of working memory with networks of integrate-and-fire neurons connected by plastic synapses. Journal of Physiology-Paris, 2003, 97(4-6): 659-681.
[14] Fusi S, Annunziato M, Badoni D, Salamon A, Amit DJ. Spike-driven synaptic plasticity: theory, simulation, VLSI implementation. Neural Computation, 2000, 12(10): 2227-2258.
[15] Fusi S, Abbott LF. Limits on the memory storage capacity of bounded synapses. Nature Neuroscience, 2007, 10(4): 485-493.
[16] Hopfield JJ. Neurons with graded response have collective computational properties like those of two-state neurons. Proceedings of the National Academy of Sciences of the United States of America, 1984, 81(10): 3088-3092.
[17] McGaugh JL. Memory: a century of consolidation. Science, 2000, 287(5451): 248-251.
[18] Mongillo G, Curti E, Romani S, Amit DJ. Learning in realistic networks of spiking neurons and spike-driven plastic synapses. European Journal of Neuroscience, 2005, 21(11): 3143-3160.
[19] Nadal JP, Toulouse G. Information storage in sparsely coded memory nets. Network: Computation in Neural Systems, 1990, 1(1): 61-74.
[20] Nadal JP. Associative memory: on the (puzzling) sparse coding limit. Journal of Physics A: Mathematical and General, 1991, 24(5): 1093-1101.