Temporal Effective Batch Normalization in Spiking Neural Networks

Cited by: 0
Authors
Duan, Chaoteng [1 ]
Ding, Jianhao [2 ]
Chen, Shiyan [1 ]
Yu, Zhaofei [3 ]
Huang, Tiejun [2 ]
Affiliations
[1] Peking Univ, Sch Elect & Comp Engn, Beijing 100871, Peoples R China
[2] Peking Univ, Sch Comp Sci, Beijing 100871, Peoples R China
[3] Peking Univ, Sch Comp Sci, Inst Artificial Intelligence, Beijing 100871, Peoples R China
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022) | 2022
Funding
National Natural Science Foundation of China
Keywords
NEURONS
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Spiking Neural Networks (SNNs) are promising for neuromorphic hardware owing to their use of spatio-temporal information and sparse, event-driven signal processing. However, SNNs are difficult to train because of the non-differentiable binary firing function. Surrogate gradients alleviate this problem and allow SNNs to reach performance comparable to Artificial Neural Networks (ANNs) with the same structure. Unfortunately, batch normalization, which contributes to the success of ANNs, does not play a prominent role in SNNs because of the additional temporal dimension. To this end, we propose an effective normalization method called temporal effective batch normalization (TEBN). By rescaling the presynaptic inputs with different weights at every time-step, the temporal distributions become smoother and more uniform. Theoretical analysis shows that TEBN can be viewed as a smoother of the SNN optimization landscape and could help stabilize the gradient norm. Experimental results on both static and neuromorphic datasets show that SNNs with TEBN surpass state-of-the-art accuracy with fewer time-steps and achieve better robustness to hyper-parameters than other normalization methods.
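The abstract describes TEBN as normalizing over the extra temporal dimension and rescaling the presynaptic inputs with different weights at every time-step. The sketch below is a minimal PyTorch illustration of that idea, assuming inputs shaped [T, B, C, H, W]; the module name TEBNSketch, the shared BatchNorm2d over the merged time and batch dimensions, and the per-time-step parameter step_scale are assumptions made for illustration, not the paper's reference implementation.

```python
import torch
import torch.nn as nn


class TEBNSketch(nn.Module):
    """Minimal sketch: shared batch normalization plus per-time-step rescaling.

    Input is assumed to be shaped [T, B, C, H, W] (time-steps, batch, channels,
    height, width). A single BatchNorm2d computes statistics over the merged
    time and batch dimensions, and each time-step is then rescaled by its own
    learnable weight, mirroring the abstract's description of rescaling
    presynaptic inputs with different weights at every time-step. This is a
    sketch, not the authors' reference implementation.
    """

    def __init__(self, num_channels: int, num_steps: int):
        super().__init__()
        self.bn = nn.BatchNorm2d(num_channels)  # statistics shared across time-steps
        # one learnable scale per time-step, broadcast over batch/channel/space
        self.step_scale = nn.Parameter(torch.ones(num_steps, 1, 1, 1, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        t, b, c, h, w = x.shape
        # fold time into the batch dimension so BN sees all time-steps at once
        y = self.bn(x.reshape(t * b, c, h, w)).reshape(t, b, c, h, w)
        return y * self.step_scale  # per-time-step rescaling


if __name__ == "__main__":
    x = torch.randn(4, 8, 16, 32, 32)  # T=4 time-steps, batch of 8, 16 channels
    tebn = TEBNSketch(num_channels=16, num_steps=4)
    print(tebn(x).shape)  # torch.Size([4, 8, 16, 32, 32])
```

Sharing the normalization statistics across time while learning a separate scale per time-step is one reading of the abstract; the paper may parameterize the per-step weights and biases differently.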
Pages: 14