Adaptive spiking neuron with population coding for a residual spiking neural network

Cited by: 0
Authors
Dan, Yongping [1 ]
Sun, Changhao [1 ]
Li, Hengyi [2 ,3 ]
Meng, Lin [4 ]
Affiliations
[1] Zhongyuan Univ Technol, Sch Integrated Circuits, 1 Huaihe Rd, Zhengzhou 451191, Henan, Peoples R China
[2] Zhongyuan Univ Technol, Sch Automat & Elect Engn, 1 Huaihe Rd, Zhengzhou 451191, Henan, Peoples R China
[3] Ritsumeikan Univ, Res Org Sci & Technol, 1-1-1 Noji Higashi, Kusatsu, Shiga 5258577, Japan
[4] Ritsumeikan Univ, Coll Sci & Engn, 1-1-1 Noji Higashi, Kusatsu, Shiga 5258577, Japan
Keywords
Spiking neural networks; Spiking neuron; Population coding; Information
DOI
10.1007/s10489-024-06128-z
CLC number
TP18 [Theory of Artificial Intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Spiking neural networks (SNNs) have attracted significant research attention due to their inherent sparsity and event-driven processing. Recent studies indicate that incorporating convolutional and residual structures into SNNs can substantially enhance performance. However, such converted spiking residual structures bring increased complexity and stacks of parameterized spiking neurons. To address this challenge, this paper proposes a carefully refined two-layer decision structure for residual-based SNNs, consisting solely of fully connected and spiking neuron layers. Specifically, the spiking neuron layers adopt a dynamic leaky integrate-and-fire (DLIF) neuron model with a nonlinear self-feedback mechanism, characterized by dynamic threshold adjustment and a self-regulating firing rate. Furthermore, departing from traditional direct encoding, which considers only the firing frequency of individual neurons, we introduce a mixed coding mechanism that combines direct encoding with multi-neuron population decoding. The proposed architecture improves the adaptability and responsiveness of spiking neurons across computational contexts. Experimental results demonstrate the efficacy of the approach: despite a highly simplified structure using only 6 timesteps, it outperforms multiple state-of-the-art methods, achieving accuracy improvements of 0.01-1.99% on three static datasets and 0.14-7.50% on three neuromorphic datasets. The DLIF model also excels at information processing, exhibiting roughly twice the mutual information of other neuron models, and on the sequential MNIST task it balances biological realism with practicality, enhancing memory and dynamic range. The proposed method offers improved computational efficiency and a simplified network structure, enhances the biological plausibility of SNN models, and can be easily adapted to other deep SNNs.
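The record does not reproduce the paper's equations, so the following is only a minimal illustrative sketch, assuming a PyTorch-style setup, of how a leaky integrate-and-fire neuron with an adaptive threshold and a simple population-decoding readout can be combined; the constants, the threshold-update rule, and the class-to-neuron layout are assumptions for illustration, not the authors' DLIF formulation.

import torch
import torch.nn as nn


class AdaptiveLIF(nn.Module):
    """Illustrative LIF neuron whose threshold adapts to its own firing.

    The decay constant and threshold-update rule are assumptions; the
    paper's DLIF model defines its own nonlinear self-feedback dynamics.
    """

    def __init__(self, tau: float = 2.0, v_th0: float = 1.0, beta: float = 0.1):
        super().__init__()
        self.tau = tau      # membrane time constant (assumed)
        self.v_th0 = v_th0  # baseline firing threshold (assumed)
        self.beta = beta    # strength of threshold adaptation (assumed)

    def forward(self, x_seq: torch.Tensor) -> torch.Tensor:
        # x_seq: [T, batch, features] input current over T timesteps
        v = torch.zeros_like(x_seq[0])          # membrane potential
        v_th = torch.full_like(v, self.v_th0)   # per-neuron dynamic threshold
        spikes = []
        for x in x_seq:
            v = v + (x - v) / self.tau          # leaky integration toward the input
            s = (v >= v_th).float()             # fire when the potential crosses the threshold
            v = v * (1.0 - s)                   # hard reset after a spike
            # self-feedback: firing raises the threshold, silence relaxes it
            v_th = v_th + self.beta * s - 0.5 * self.beta * (v_th - self.v_th0)
            spikes.append(s)
        return torch.stack(spikes)              # [T, batch, features]


def population_decode(spikes: torch.Tensor, num_classes: int) -> torch.Tensor:
    """Average spike rates over time and over the group of output neurons
    assigned to each class (a simple form of population decoding)."""
    rates = spikes.mean(dim=0)                  # [batch, features]
    batch, features = rates.shape
    per_class = features // num_classes         # neurons per class (assumed layout)
    grouped = rates[:, : per_class * num_classes].reshape(batch, num_classes, per_class)
    return grouped.mean(dim=-1)                 # [batch, num_classes] class scores

In this sketch the decision head would be a fully connected layer followed by AdaptiveLIF, with population_decode turning the spike trains of several neurons per class into class scores, mirroring (under the stated assumptions) the two-layer fully connected plus spiking neuron structure described in the abstract.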
Pages: 24