Fluctuation-driven initialization for spiking neural network training

Times cited: 8
Authors
Rossbroich, Julian [1 ,2 ]
Gygax, Julia [1 ,2 ]
Zenke, Friedemann [1 ,2 ]
Affiliations
[1] Friedrich Miescher Inst Biomed Res, Basel, Switzerland
[2] Univ Basel, Fac Sci, Basel, Switzerland
Source
NEUROMORPHIC COMPUTING AND ENGINEERING | 2022, Vol. 2, No. 4
Funding
Swiss National Science Foundation
Keywords
spiking neural networks; surrogate gradients; initialization strategies; recurrent neural networks; Dale's law; homeostatic plasticity; balanced networks
KeyWords Plus
INTEGRATION; NEURONS; POWER; MODEL
DOI
10.1088/2634-4386/ac97bb
Chinese Library Classification
TM [electrical technology]; TN [electronic technology, communication technology]
Subject Classification Codes
0808; 0809
Abstract
Spiking neural networks (SNNs) underlie low-power, fault-tolerant information processing in the brain and could constitute a power-efficient alternative to conventional deep neural networks when implemented on suitable neuromorphic hardware accelerators. However, instantiating SNNs that solve complex computational tasks in silico remains a significant challenge. Surrogate gradient (SG) techniques have emerged as a standard solution for training SNNs end-to-end. Still, their success depends on synaptic weight initialization, just as in conventional artificial neural networks (ANNs). Yet, unlike for ANNs, it remains unclear what constitutes a good initial state for an SNN. Here, we develop a general initialization strategy for SNNs inspired by the fluctuation-driven regime commonly observed in the brain. Specifically, we derive practical solutions for data-dependent weight initialization that ensure fluctuation-driven firing in the widely used leaky integrate-and-fire (LIF) neurons. We empirically show that SNNs initialized following our strategy exhibit superior learning performance when trained with SGs. These findings generalize across several datasets and SNN architectures, including fully connected, deep convolutional, recurrent, and more biologically plausible SNNs obeying Dale's law. Thus, fluctuation-driven initialization provides a practical, versatile, and easy-to-implement strategy for improving SNN training performance on diverse tasks in neuromorphic engineering and computational neuroscience.
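The recipe the abstract describes, choosing weights so that membrane-potential fluctuations rather than a suprathreshold mean drive trigger spikes, can be illustrated compactly. Below is a minimal, independent NumPy sketch of the general idea, not the authors' implementation: it applies Campbell's theorem for shot noise to pick the variance of a zero-mean weight distribution so that the stationary membrane fluctuations of current-based LIF neurons land at a target fraction of the firing threshold. The function names (psp_kernel_integrals, fluctuation_driven_init), the double-exponential PSP kernel, and all default parameter values (nu, tau_mem, tau_syn, sigma_u_target) are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of fluctuation-driven weight initialization for one
# feed-forward layer of current-based LIF neurons. Illustrative only;
# kernel shape, parameter values, and names are assumptions.
import numpy as np


def psp_kernel_integrals(tau_mem, tau_syn, dt=1e-4, t_max=0.5):
    """Numerically integrate the postsynaptic-potential kernel eps(t) of a
    LIF neuron with exponential current-based synapses (tau_mem != tau_syn),
    returning (integral of eps, integral of eps**2) for Campbell's theorem."""
    t = np.arange(0.0, t_max, dt)
    # Double-exponential PSP from a unit-amplitude exponential synaptic current.
    eps = (tau_syn / (tau_mem - tau_syn)) * (
        np.exp(-t / tau_mem) - np.exp(-t / tau_syn)
    )
    return float(np.sum(eps) * dt), float(np.sum(eps ** 2) * dt)


def fluctuation_driven_init(n_in, n_out, nu, tau_mem=20e-3, tau_syn=5e-3,
                            sigma_u_target=1.0, seed=0):
    """Draw an (n_in, n_out) weight matrix so that, for n_in Poisson inputs
    firing at rate nu (Hz), the stationary membrane potential has mean ~0 and
    standard deviation ~sigma_u_target (in units of the firing threshold).

    Campbell's theorem for shot noise gives
        Var[U] = nu * sum_i w_i**2 * integral(eps(t)**2 dt),
    so zero-mean Gaussian weights with per-weight variance
        sigma_w**2 = sigma_u_target**2 / (n_in * nu * integral(eps**2 dt))
    hit the target fluctuation level in expectation."""
    _, eps_sq = psp_kernel_integrals(tau_mem, tau_syn)
    sigma_w = sigma_u_target / np.sqrt(n_in * nu * eps_sq)
    rng = np.random.default_rng(seed)
    return rng.normal(0.0, sigma_w, size=(n_in, n_out))


# Example: 200 Poisson inputs at 10 Hz onto 100 LIF units.
W = fluctuation_driven_init(n_in=200, n_out=100, nu=10.0)
print(W.std())  # per-weight std chosen so fluctuations sit near threshold
```

With zero-mean weights the mean membrane potential is zero, so any spiking is driven by fluctuations; that is the defining property of the fluctuation-driven regime the paper targets. Making the scheme data-dependent, as the abstract describes, would amount to estimating the input rate nu from the statistics of the training set rather than fixing it by hand.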
Pages: 25