Memory-Dependent Computation and Learning in Spiking Neural Networks Through Hebbian Plasticity

Cited by: 1
Authors
Limbacher, Thomas [1 ]
Ozdenizci, Ozan [2 ,3 ]
Legenstein, Robert [1 ]
Affiliations
[1] Graz Univ Technol, Fac Comp Sci & Biomed Engn, A-8010 Graz, Austria
[2] Graz Univ Technol, Fac Comp Sci & Biomed Engn, A-8010 Graz, Austria
[3] Silicon Austria Labs, TU Graz SAL Dependable Embedded Syst Lab, A-8010 Graz, Austria
Keywords
Few-shot learning; Hebbian plasticity; memory; spiking neural networks (SNNs); MODEL; POTENTIATION; ARCHITECTURE; NEUROSCIENCE;
DOI
10.1109/TNNLS.2023.3341446
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405
Abstract
Spiking neural networks (SNNs) are the basis for many energy-efficient neuromorphic hardware systems. While there has been substantial progress in SNN research, artificial SNNs still lack many capabilities of their biological counterparts. In biological neural systems, memory is a key component that enables the retention of information over a huge range of temporal scales, from hundreds of milliseconds up to years. While Hebbian plasticity is believed to play a pivotal role in biological memory, it has so far been analyzed mostly in the context of pattern completion and unsupervised learning in artificial and spiking neural networks. Here, we propose that Hebbian plasticity is fundamental for computations in biological and artificial spiking neural systems. We introduce a novel memory-augmented SNN architecture that is enriched by Hebbian synaptic plasticity. We show that this Hebbian enrichment renders SNNs surprisingly versatile in terms of both their computational and their learning capabilities. It improves their abilities for out-of-distribution generalization, one-shot learning, cross-modal generative association, language processing, and reward-based learning. This suggests that powerful cognitive neuromorphic systems can be built on this principle.
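The paper's actual memory-augmented SNN architecture is not reproduced in this record. As a generic illustration of the Hebbian plasticity principle the abstract refers to, the following is a minimal sketch of a Hebbian associative memory, where a weight change is proportional to the product of presynaptic and postsynaptic activity; the class name, `eta`, and the vector dimensions are illustrative, not taken from the paper.

```python
import numpy as np

class HebbianMemory:
    """Minimal Hebbian associative memory (illustrative sketch only)."""

    def __init__(self, dim, eta=1.0):
        self.W = np.zeros((dim, dim))  # synaptic weight matrix
        self.eta = eta                 # Hebbian learning rate

    def write(self, key, value):
        # Hebbian update: weight change proportional to the outer product
        # of postsynaptic (value) and presynaptic (key) activity.
        self.W += self.eta * np.outer(value, key)

    def read(self, key):
        # Recall by projecting the cue through the stored weights.
        return self.W @ key

mem = HebbianMemory(dim=4)
k = np.array([1.0, 0.0, 0.0, 0.0])   # cue pattern
v = np.array([0.0, 1.0, 1.0, 0.0])   # associated pattern
mem.write(k, v)                      # one-shot storage
print(np.allclose(mem.read(k), v))   # prints True
```

A single `write` suffices for recall here because the cue is a unit vector, which is the one-shot-storage property the abstract highlights; in the paper this mechanism operates on spiking activity within a trained SNN rather than on raw vectors.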
Pages: 2551 - 2562
Page count: 12
Related Papers
50 total (items [41] - [50] shown)
  • [41] Memory-efficient neurons and synapses for spike-timing-dependent-plasticity in large-scale spiking networks
    Urbizagastegui, Pablo
    van Schaik, Andre
    Wang, Runchun
    FRONTIERS IN NEUROSCIENCE, 2024, 18
  • [42] Concept learning through deep reinforcement learning with memory-augmented neural networks
    Shi, Jing
    Xu, Jiaming
    Yao, Yiqun
    Xu, Bo
    NEURAL NETWORKS, 2019, 110 : 47 - 54
  • [43] Stable memory and computation in randomly rewiring neural networks
    Acker, Daniel
    Paradis, Suzanne
    Miller, Paul
    JOURNAL OF NEUROPHYSIOLOGY, 2019, 122 (01) : 66 - 80
  • [44] Smooth Exact Gradient Descent Learning in Spiking Neural Networks
    Klos, Christian
    Memmesheimer, Raoul-Martin
    PHYSICAL REVIEW LETTERS, 2025, 134 (02)
  • [45] A Curiosity-Based Learning Method for Spiking Neural Networks
    Shi, Mengting
    Zhang, Tielin
    Zeng, Yi
    FRONTIERS IN COMPUTATIONAL NEUROSCIENCE, 2020, 14
  • [46] Advancing Spiking Neural Networks Toward Deep Residual Learning
    Hu, Yifan
    Deng, Lei
    Wu, Yujie
    Yao, Man
    Li, Guoqi
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36 (02) : 2353 - 2367
  • [47] Convolutional Neural Networks with Hebbian-Based Rules in Online Transfer Learning
    Aguilar Canto, Fernando Javier
    ADVANCES IN SOFT COMPUTING, MICAI 2020, PT I, 2020, 12468 : 35 - 49
  • [48] Supervised learning in spiking neural networks with noise-threshold
    Zhang, Malu
    Qu, Hong
    Xie, Xiurui
    Kurths, Juergen
    NEUROCOMPUTING, 2017, 219 : 333 - 349
  • [49] Matching Recall and Storage in Sequence Learning with Spiking Neural Networks
    Brea, Johanni
    Senn, Walter
    Pfister, Jean-Pascal
    JOURNAL OF NEUROSCIENCE, 2013, 33 (23) : 9565 - 9575
  • [50] Reinforcement Learning in Spiking Neural Networks with Stochastic and Deterministic Synapses
    Yuan, Mengwen
    Wu, Xi
    Yan, Rui
    Tang, Huajin
    NEURAL COMPUTATION, 2019, 31 (12) : 2368 - 2389