Exploiting noise as a resource for computation and learning in spiking neural networks

Times Cited: 12
Authors
Ma, Gehua [1 ]
Yan, Rui [2 ]
Tang, Huajin [1 ,3 ]
Affiliations
[1] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou, Peoples R China
[2] Zhejiang Univ Technol, Coll Comp Sci & Technol, Hangzhou, Peoples R China
[3] Zhejiang Univ, State Key Lab Brain Machine Intelligence, Hangzhou, Peoples R China
Source
PATTERNS | 2023, Vol. 4, No. 10
Funding
National Natural Science Foundation of China;
Keywords
STOCHASTIC RESONANCE; NEURONS; MODELS; POWER; VARIABILITY; FLUCTUATIONS; INTELLIGENCE; PLASTICITY; LOIHI;
DOI
10.1016/j.patter.2023.100831
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Networks of spiking neurons underpin the extraordinary information-processing capabilities of the brain and have become pillar models in neuromorphic artificial intelligence. Despite extensive research on spiking neural networks (SNNs), most studies are built on deterministic models, overlooking the inherently nondeterministic, noisy nature of neural computation. This study introduces the noisy SNN (NSNN) and the noise-driven learning (NDL) rule, incorporating noisy neuronal dynamics to exploit the computational advantages of noisy neural processing. The NSNN provides a theoretical framework that yields scalable, flexible, and reliable computation and learning. We demonstrate that this framework leads to spiking neural models with competitive performance, improved robustness against challenging perturbations compared with deterministic SNNs, and closer reproduction of probabilistic computation in neural coding. Overall, this study offers a powerful, easy-to-use tool for machine learning and neuromorphic intelligence practitioners as well as computational neuroscience researchers.
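The abstract's core idea is to make the neuronal dynamics themselves stochastic rather than deterministic. As a rough illustration of that idea only, the following minimal Python sketch simulates a leaky integrate-and-fire neuron with additive Gaussian membrane noise and a sigmoidal escape-rate firing rule. The function name, the time constant, threshold, noise level, and the escape-rate form are illustrative assumptions for this sketch, not the NSNN equations or parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_lif_step(v, input_current, tau=10.0, v_th=1.0,
                   noise_std=0.2, beta=5.0, dt=1.0):
    """One Euler step of a leaky membrane with additive Gaussian noise.

    Returns the updated membrane potential and a stochastic spike (0/1),
    drawn from a sigmoid of the distance to threshold, so firing is
    probabilistic rather than a hard deterministic threshold crossing.
    All constants are illustrative, not values from the paper.
    """
    # Leaky integration plus injected white membrane noise.
    v = v + dt / tau * (-v + input_current) + noise_std * np.sqrt(dt) * rng.normal()
    # Escape-rate firing: noise softens the hard threshold into a probability.
    p_spike = 1.0 / (1.0 + np.exp(-beta * (v - v_th)))
    spike = float(rng.random() < p_spike)
    # Soft reset: subtract the threshold after a spike.
    v = v - spike * v_th
    return v, spike

# Drive the neuron with a constant current and record its spike train.
v, spikes = 0.0, []
for _ in range(100):
    v, s = noisy_lif_step(v, input_current=1.2)
    spikes.append(s)
print(f"firing rate: {np.mean(spikes):.2f} spikes/step")
```

Because the threshold crossing is stochastic, repeated runs with the same input produce different spike trains, which is the nondeterministic behavior the abstract contrasts with deterministic SNNs; a sigmoidal escape rate is just one common way to model noisy spiking, and the paper's actual noise model may differ.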
Pages: 17