Efficient parameter calibration and real-time simulation of large-scale spiking neural networks with GeNN and NEST

Cited by: 1
Authors
Schmitt, Felix Johannes [1 ]
Rostami, Vahid [1 ]
Nawrot, Martin Paul [1 ]
Affiliations
[1] Univ Cologne, Inst Zool, Computat Syst Neurosci, Cologne, Germany
Keywords
computational neuroscience; attractor neural network; metastability; real-time simulation; computational neuroethology; spiking neural network (SNN); NERVOUS-SYSTEM; NEURONS; DYNAMICS; CONNECTIVITY; DIVERSITY; MEMORY; BRAIN; MODEL; INTEGRATION; CIRCUITS;
DOI
10.3389/fninf.2023.941696
CLC classification
Q [Biological Sciences];
Subject classification codes
07 ; 0710 ; 09 ;
Abstract
Spiking neural networks (SNNs) represent the state-of-the-art approach to biologically realistic modeling of nervous system function. Systematic calibration of multiple free model parameters is necessary to achieve robust network function and demands high computing power and large memory resources. Special requirements arise from closed-loop model simulation in virtual environments and from real-time simulation in robotic applications. Here, we compare two complementary approaches to efficient large-scale and real-time SNN simulation. The widely used NEural Simulation Tool (NEST) parallelizes simulation across multiple CPU cores. The GPU-enhanced Neural Network (GeNN) simulator uses the highly parallel GPU-based architecture to gain simulation speed. We quantify fixed and variable simulation costs on single machines with different hardware configurations. As a benchmark model, we use a spiking cortical attractor network with a topology of densely connected excitatory and inhibitory neuron clusters with homogeneous or distributed synaptic time constants, compared against the random balanced network. We show that simulation time scales linearly with the simulated biological model time and, for large networks, approximately linearly with the model size, dominated by the number of synaptic connections. Additional fixed costs with GeNN are almost independent of model size, while fixed costs with NEST increase linearly with model size. We demonstrate how GeNN can be used for simulating networks with up to 3.5 × 10^6 neurons (> 3 × 10^12 synapses) on a high-end GPU, and up to 250,000 neurons (25 × 10^9 synapses) on a low-cost GPU. Real-time simulation was achieved for networks with 100,000 neurons. Network calibration and parameter grid search can be efficiently achieved using batch processing. We discuss the advantages and disadvantages of both approaches for different use cases.
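The cost decomposition described in the abstract (a fixed setup cost plus a variable cost that scales linearly with biological model time and with the number of synapses) can be sketched as a toy model. All coefficients below are invented for illustration; they are not the paper's measured values for GeNN or NEST:

```python
def wall_clock_time(n_synapses, t_bio_s, fixed_s, cost_per_syn_s):
    """Estimated wall-clock simulation time in seconds.

    fixed_s        -- one-off setup cost (e.g., code generation, network build)
    cost_per_syn_s -- variable cost per synapse per second of biological time
    """
    return fixed_s + cost_per_syn_s * n_synapses * t_bio_s


def real_time_factor(n_synapses, t_bio_s, fixed_s, cost_per_syn_s):
    """Wall-clock time divided by biological time; <= 1 means real-time capable."""
    return wall_clock_time(n_synapses, t_bio_s, fixed_s, cost_per_syn_s) / t_bio_s


# Hypothetical numbers: 10^9 synapses, 10 s of biological time,
# 30 s fixed cost, 5e-10 s of compute per synapse per biological second.
t = wall_clock_time(1e9, 10.0, 30.0, 5e-10)      # 30 + 0.5 * 10 = 35 s
rtf = real_time_factor(1e9, 10.0, 30.0, 5e-10)   # 3.5x slower than real time
```

In this picture, a GeNN-like simulator corresponds to a `fixed_s` that is nearly constant in network size, whereas a NEST-like simulator has a `fixed_s` that itself grows with the number of synapses.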
Pages: 18