Prototype Analysis in Hopfield Networks With Hebbian Learning

Cited by: 0
Authors
Mcalister, Hayden [1 ]
Robins, Anthony [1 ]
Szymanski, Lech [1 ]
Affiliations
[1] Univ Otago, Sch Comp, Dunedin 9016, New Zealand
Keywords
Neural networks; Associative memory; Models; Patterns; Capacity
DOI
10.1162/neco_a_01704
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We discuss prototype formation in the Hopfield network. Typically, Hebbian learning with highly correlated states leads to degraded memory performance. We show that this type of learning can lead to prototype formation, where unlearned states emerge as representatives of large correlated subsets of states, alleviating capacity woes. This process has similarities to prototype learning in human cognition. We provide a substantial literature review of prototype learning in associative memories, covering contributions from psychology, statistical physics, and computer science. We analyze prototype formation from a theoretical perspective and derive a stability condition for these states based on the number of examples of the prototype presented for learning, the noise in those examples, and the number of nonexample states presented. The stability condition is used to construct a probability of stability for a prototype state as the factors of stability change. We also note similarities to traditional network analysis, allowing us to find a prototype capacity. We corroborate these expectations of prototype formation with experiments using a simple Hopfield network with standard Hebbian learning. We extend our experiments to a Hopfield network trained on data with multiple prototypes and find the network is capable of stabilizing multiple prototypes concurrently. We measure the basins of attraction of the multiple prototype states, finding attractor strength grows with the number of examples and the agreement of examples. We link the stability and dominance of prototype states to the energy profile of these states, particularly when comparing the profile shape to target states or other spurious states.
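As a companion to the abstract, the following is a minimal sketch (not the authors' code) of the basic setup it describes: a bipolar Hopfield network with standard Hebbian learning, trained only on noisy examples of a single hidden prototype. The network size, number of examples, and noise level below are arbitrary illustrative choices; the script simply checks whether the unlearned prototype ends up as a stable (fixed-point) state.

```python
# Minimal sketch: prototype formation under standard Hebbian learning
# in a Hopfield network (illustrative parameters, not from the paper).
import numpy as np

rng = np.random.default_rng(0)

N = 200             # number of neurons (assumed size)
num_examples = 20   # noisy examples of one prototype
flip_prob = 0.15    # per-bit noise in each example

# Hidden prototype: a random bipolar (+1/-1) state, never learned directly.
prototype = rng.choice([-1, 1], size=N)

# Noisy examples: flip each bit independently with probability flip_prob.
flips = rng.random((num_examples, N)) < flip_prob
examples = np.where(flips, -prototype, prototype)

# Standard Hebbian learning: sum of outer products of the learned states,
# scaled by N, with self-connections zeroed.
W = examples.T @ examples / N
np.fill_diagonal(W, 0.0)

def is_stable(state, W):
    """A state is stable if one synchronous update leaves it unchanged."""
    local_field = W @ state
    updated = np.where(local_field >= 0, 1, -1)
    return np.array_equal(updated, state)

print("prototype stable:", is_stable(prototype, W))
print("fraction of examples stable:",
      np.mean([is_stable(x, W) for x in examples]))
```

With settings like these, the unlearned prototype is typically stable after learning while many of the individual noisy examples are not, which is the qualitative effect the paper analyzes and quantifies via its stability condition and prototype capacity.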
Pages: 2322-2364
Page count: 43