Temporal Effective Batch Normalization in Spiking Neural Networks

Cited: 0
Authors
Duan, Chaoteng [1 ]
Ding, Jianhao [2 ]
Chen, Shiyan [1 ]
Yu, Zhaofei [3 ]
Huang, Tiejun [2 ]
Affiliations
[1] Peking Univ, Sch Elect & Comp Engn, Beijing 100871, Peoples R China
[2] Peking Univ, Sch Comp Sci, Beijing 100871, Peoples R China
[3] Peking Univ, Sch Comp Sci, Inst Artificial Intelligence, Beijing 100871, Peoples R China
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022) | 2022
Funding
National Natural Science Foundation of China;
Keywords
NEURONS;
DOI
N/A
CLC number
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Spiking Neural Networks (SNNs) are promising for neuromorphic hardware owing to their use of spatio-temporal information and sparse, event-driven signal processing. However, training SNNs is challenging due to the non-differentiable nature of the binary firing function. Surrogate gradients alleviate this problem and enable SNNs to achieve performance comparable to Artificial Neural Networks (ANNs) of the same structure. Unfortunately, batch normalization, a key contributor to the success of ANNs, does not play a prominent role in SNNs because of the additional temporal dimension. To this end, we propose an effective normalization method called Temporal Effective Batch Normalization (TEBN). By rescaling the presynaptic inputs with different weights at every time-step, the temporal distributions become smoother and more uniform. Theoretical analysis shows that TEBN can be viewed as a smoother of the SNN's optimization landscape and helps stabilize the gradient norm. Experimental results on both static and neuromorphic datasets show that SNNs with TEBN surpass state-of-the-art accuracy with fewer time-steps and achieve better robustness to hyper-parameters than other normalization methods.
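From the abstract's description, TEBN applies batch-normalization statistics to the presynaptic inputs and then rescales each time-step with its own weight. A minimal NumPy sketch of that idea follows; the function name `tebn_sketch`, the `(T, N, C)` tensor layout, and the choice to share normalization statistics over batch and time while keeping the affine parameters time-dependent are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def tebn_sketch(x, gamma_t, beta_t, eps=1e-5):
    """Hedged sketch of Temporal Effective Batch Normalization (TEBN).

    x       : presynaptic inputs of shape (T, N, C) -- time, batch, channels.
    gamma_t : per-time-step scale, shape (T, 1, C).
    beta_t  : per-time-step shift, shape (T, 1, C).

    One mean/variance per channel is computed over batch AND time, so the
    normalization is shared; the affine rescaling differs at every time-step,
    giving each temporal slice its own effective weight.
    """
    mean = x.mean(axis=(0, 1), keepdims=True)   # per-channel mean over (T, N)
    var = x.var(axis=(0, 1), keepdims=True)     # per-channel variance over (T, N)
    x_hat = (x - mean) / np.sqrt(var + eps)     # shared normalization
    return gamma_t * x_hat + beta_t             # time-dependent rescaling

# Toy usage: T=4 time-steps, N=8 samples, C=3 channels.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=3.0, size=(4, 8, 3))
gamma_t = np.ones((4, 1, 3))   # identity rescaling for the demo
beta_t = np.zeros((4, 1, 3))
y = tebn_sketch(x, gamma_t, beta_t)
```

With identity affine parameters the output is simply the normalized input (zero mean, unit variance per channel); in training, `gamma_t` and `beta_t` would be learned so each time-step is rescaled differently.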
Pages: 14