Temporal Effective Batch Normalization in Spiking Neural Networks

Cited by: 0
Authors
Duan, Chaoteng [1 ]
Ding, Jianhao [2 ]
Chen, Shiyan [1 ]
Yu, Zhaofei [3 ]
Huang, Tiejun [2 ]
Affiliations
[1] Peking Univ, Sch Elect & Comp Engn, Beijing 100871, Peoples R China
[2] Peking Univ, Sch Comp Sci, Beijing 100871, Peoples R China
[3] Peking Univ, Sch Comp Sci, Inst Artificial Intelligence, Beijing 100871, Peoples R China
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022) | 2022
Funding
National Natural Science Foundation of China;
Keywords
NEURONS;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Spiking Neural Networks (SNNs) are promising for neuromorphic hardware owing to their use of spatio-temporal information and sparse, event-driven signal processing. However, training SNNs is challenging because of the non-differentiable nature of the binary firing function. Surrogate gradients alleviate this training problem and enable SNNs to achieve performance comparable to Artificial Neural Networks (ANNs) of the same structure. Unfortunately, batch normalization, which contributes substantially to the success of ANNs, does not play a prominent role in SNNs because of the additional temporal dimension. To this end, we propose an effective normalization method called temporal effective batch normalization (TEBN). By rescaling the presynaptic inputs with different weights at every time-step, the temporal distributions become smoother and more uniform. Theoretical analysis shows that TEBN can be viewed as a smoother of the SNN's optimization landscape and can help stabilize the gradient norm. Experimental results on both static and neuromorphic datasets show that SNNs with TEBN surpass state-of-the-art accuracy with fewer time-steps, and are more robust to hyper-parameters than other normalization methods.
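The abstract describes TEBN as rescaling normalized presynaptic inputs with a separate learnable weight at each time-step. The following NumPy sketch illustrates one plausible reading of that forward pass; the pooling of statistics over both batch and time, and the names `tebn_forward`, `gamma`, `beta`, and `p`, are assumptions of this illustration, not the paper's reference implementation.

```python
import numpy as np

def tebn_forward(x, gamma, beta, p, eps=1e-5):
    """Sketch of a TEBN-style forward pass.

    x     : presynaptic input, shape (T, B, C) = (time, batch, channels)
    gamma : shared per-channel scale, shape (C,)
    beta  : shared per-channel shift, shape (C,)
    p     : per-time-step weights, shape (T,) -- the "temporal" rescaling
    """
    # Per-channel statistics pooled over time and batch (an assumption here)
    mean = x.mean(axis=(0, 1), keepdims=True)
    var = x.var(axis=(0, 1), keepdims=True)
    # Standardize, then apply the shared affine transform
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Rescale each time-step t by its own learnable weight p[t]
    return p[:, None, None] * (gamma * x_hat + beta)

# Usage: with p = 1, gamma = 1, beta = 0 this reduces to plain
# batch normalization over the merged time and batch dimensions.
T, B, C = 4, 8, 3
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=3.0, size=(T, B, C))
y = tebn_forward(x, np.ones(C), np.zeros(C), np.ones(T))
```

In training, `p` (and `gamma`, `beta`) would be learned jointly with the network weights, letting each time-step's input distribution be scaled independently.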
Pages: 14