Temporal Conditioning Spiking Latent Variable Models of the Neural Response to Natural Visual Scenes

Times Cited: 0
Authors
Ma, Gehua [1 ]
Jiang, Runhao [1 ]
Yan, Rui [1 ]
Tang, Huajin [1 ]
Institutions
[1] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou, Peoples R China
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023) | 2023
Funding
National Natural Science Foundation of China;
Keywords
INFORMATION; NOISE; NETWORKS; INTEGRATION; ADAPTATION; CONTRAST; RETINA; TRAINS; SPACE;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Developing computational models of neural response is crucial for understanding sensory processing and neural computations. Current state-of-the-art neural network methods use temporal filters to handle temporal dependencies, resulting in an unrealistic and inflexible processing paradigm. Moreover, these methods target trial-averaged firing rates and fail to capture important features in spike trains. This work presents the temporal conditioning spiking latent variable models (TeCoS-LVM) to simulate the neural response to natural visual stimuli. We use spiking neurons to produce spike outputs that directly match the recorded spike trains. This approach helps to avoid losing information embedded in the original spike trains. We exclude the temporal dimension from the model parameter space and introduce a temporal conditioning operation that allows the model to adaptively explore and exploit temporal dependencies in stimulus sequences in a natural paradigm. We show that TeCoS-LVM models produce more realistic spike activities and fit spike statistics more accurately than powerful alternatives. Additionally, learned TeCoS-LVM models generalize well to longer time scales. Overall, while remaining computationally tractable, our model effectively captures key features of neural coding systems. It thus provides a useful tool for building accurate predictive computational accounts of various sensory perception circuits.
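The abstract's central idea, replacing explicit temporal filters with spiking neurons whose running internal state carries temporal context across steps, can be illustrated with a minimal leaky integrate-and-fire (LIF) sketch. This is a toy model under assumed parameters (`tau`, `v_th`), not the authors' TeCoS-LVM architecture or its temporal conditioning operation; it only shows how a stateful spiking unit turns a stimulus sequence into a binary spike train whose timing depends on stimulus history.

```python
import math

def lif_spike_train(stimulus, tau=20.0, v_th=1.0, dt=1.0):
    """Leaky integrate-and-fire neuron driven by a stimulus sequence.

    Toy illustration only: the membrane potential is a running state
    that carries temporal context across steps, so the output spike
    train depends on stimulus history without any explicit temporal
    filter. All parameter values are illustrative.
    """
    decay = math.exp(-dt / tau)  # per-step membrane leak factor
    v, spikes = 0.0, []
    for s in stimulus:
        v = decay * v + s   # integrate the input with leak
        if v >= v_th:       # threshold crossing emits a spike
            spikes.append(1)
            v = 0.0         # hard reset after spiking
        else:
            spikes.append(0)
    return spikes

# A step stimulus: silence, sustained drive, silence.
stim = [0.0] * 50 + [0.3] * 100 + [0.0] * 50
train = lif_spike_train(stim)
print(sum(train), train.index(1))  # spike count and first spike time
```

Because the spike output is binary rather than a trial-averaged rate, statistics such as inter-spike intervals are directly available from `train`, which is the kind of spike-train-level information the abstract argues rate-based models discard.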
Pages: 22
Cited References
86 in total
[51] McIntosh, Lane. NeurIPS, 2016.
[52] Meister, M.; Berry, M. J. The neural code of the retina. Neuron, 1999, 22(3): 435-450.
[53] Meister, M.; Pine, J.; Baylor, D. A. Multi-neuronal signals from the retina: acquisition and analysis. Journal of Neuroscience Methods, 1994, 51(1): 95-106.
[54] Neftci, Emre O.; Mostafa, Hesham; Zenke, Friedemann. Surrogate gradient learning in spiking neural networks: bringing the power of gradient-based optimization to spiking neural networks. IEEE Signal Processing Magazine, 2019, 36(6): 51-63.
[55] Nichol, A. Proceedings of Machine Learning Research, 2021, vol. 139.
[56] Nieder, A.; Miller, E. K. Coding of cognitive magnitude: compressed scaling of numerical information in the primate prefrontal cortex. Neuron, 2003, 37(1): 149-157.
[57] Ölveczky, B. P.; Baccus, S. A.; Meister, M. Segregation of object and background motion in the retina. Nature, 2003, 423(6938): 401-408.
[58] Onken, Arno; Liu, Jian K.; Karunasekara, P. P. Chamanthi R.; Delis, Ioannis; Gollisch, Tim; Panzeri, Stefano. Using matrix and tensor factorizations for the single-trial analysis of population spike trains. PLOS Computational Biology, 2016, 12(11).
[59] Pandarinath, Chethan; O'Shea, Daniel J.; Collins, Jasmine; Jozefowicz, Rafal; Stavisky, Sergey D.; Kao, Jonathan C.; Trautmann, Eric M.; Kaufman, Matthew T.; Ryu, Stephen I.; Hochberg, Leigh R.; Henderson, Jaimie M.; Shenoy, Krishna V.; Abbott, L. F.; Sussillo, David. Inferring single-trial neural population dynamics using sequential auto-encoders. Nature Methods, 2018, 15(10): 805+.
[60] Park, Il Memming; Seth, Sohan; Rao, Murali; Principe, Jose C. Strictly positive-definite spike train kernels for point-process divergences. Neural Computation, 2012, 24(8): 2223-2250.