Temporal Dependent Local Learning for Deep Spiking Neural Networks

Cited by: 14
Authors
Ma, Chenxiang [1 ]
Xu, Junhai [1 ]
Yu, Qiang [1 ]
Affiliation
[1] Tianjin Univ, Coll Intelligence & Comp, Tianjin Key Lab Cognit Comp & Applicat, Tianjin, Peoples R China
Source
2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN) | 2021
Funding
National Natural Science Foundation of China
Keywords
NEURONS;
DOI
10.1109/IJCNN52387.2021.9534390
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract
Spiking neural networks (SNNs) promise to replicate the efficiency of the brain by adopting a paradigm of spike-based computation. Training deep SNNs is important both for solving practical tasks and for exploring the capabilities of spike-based computation. The biologically plausible scheme of local learning has motivated many approaches that train deep networks efficiently and in parallel. However, most existing spike-based local learning approaches perform relatively poorly on challenging tasks. In this paper, we propose a new spike-based temporal dependent local learning (TDLL) algorithm, in which each hidden layer of a deep SNN is trained independently with an auxiliary trainable spiking projection layer, and temporal dependency is fully exploited to construct local errors for adjusting parameters. We evaluate TDLL with various networks on the MNIST, Fashion-MNIST, SVHN, and CIFAR-10 datasets. Experimental results show that our method scales to larger networks and, more importantly, achieves relatively high accuracy on all benchmarks, competitive even with results obtained by global backpropagation-based methods. This work thus contributes an effective and efficient local learning method for deep SNNs, which could greatly benefit the development of distributed neuromorphic computing.
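The abstract describes each hidden layer being trained independently through an auxiliary trainable projection layer that supplies a purely local error signal. The following is a rough sketch of that local-learning *structure* only, not the authors' TDLL algorithm: spiking dynamics are replaced by a smooth rate-style activation, the temporal-dependency terms that define TDLL are omitted, and all names (`LocalLayer`, `local_update`) and hyperparameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

class LocalLayer:
    """A hidden layer plus its own auxiliary projection head.

    The head maps this layer's activity to class logits, so the
    weight update needs no error signal from later layers.
    """
    def __init__(self, n_in, n_hidden, n_classes, lr=0.5):
        self.W = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_in, n_hidden))
        self.P = rng.normal(0.0, 1.0 / np.sqrt(n_hidden), (n_hidden, n_classes))
        self.lr = lr

    def forward(self, x):
        self.x = x
        self.h = sigmoid(x @ self.W)  # smooth stand-in for a spike rate
        return self.h

    def local_update(self, y_onehot):
        """One gradient step on this layer's local cross-entropy loss."""
        p = softmax(self.h @ self.P)
        d_logits = (p - y_onehot) / len(y_onehot)
        dh = d_logits @ self.P.T                 # error stays inside the layer
        self.P -= self.lr * (self.h.T @ d_logits)
        self.W -= self.lr * (self.x.T @ (dh * self.h * (1.0 - self.h)))
        return -np.mean(np.sum(y_onehot * np.log(p + 1e-12), axis=1))

# Toy demo: learn XOR with two independently trained layers.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.eye(2)[[0, 1, 1, 0]]  # one-hot XOR labels
layers = [LocalLayer(2, 8, 2), LocalLayer(8, 8, 2)]

losses = []
for _ in range(3000):
    a = X
    step_loss = 0.0
    for layer in layers:
        a = layer.forward(a)                 # activity passes forward,
        step_loss += layer.local_update(Y)   # but gradients never do
    losses.append(step_loss)

pred = np.argmax(layers[-1].h @ layers[-1].P, axis=1)
```

Because each layer's update depends only on its own activity and its own head, the updates for different layers could in principle run in parallel, which is the efficiency argument the abstract makes for local learning.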
Pages: 7
Related Papers (50 total)
  • [1] Deep learning in spiking neural networks
    Tavanaei, Amirhossein
    Ghodrati, Masoud
    Kheradpisheh, Saeed Reza
    Masquelier, Timothee
    Maida, Anthony
    NEURAL NETWORKS, 2019, 111: 47-63
  • [2] Temporal Contrastive Learning for Spiking Neural Networks
    Qiu, Haonan
    Song, Zeyin
    Chen, Yanqi
    Ning, Munan
    Fang, Wei
    Sun, Tao
    Ma, Zhengyu
    Yuan, Li
    Tian, Yonghong
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING-ICANN 2024, PT X, 2024, 15025: 422-436
  • [3] Temporal Spike Sequence Learning via Backpropagation for Deep Spiking Neural Networks
    Zhang, Wenrui
    Li, Peng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [4] Deep Residual Learning in Spiking Neural Networks
    Fang, Wei
    Yu, Zhaofei
    Chen, Yanqi
    Huang, Tiejun
    Masquelier, Timothee
    Tian, Yonghong
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [5] Temporal Pattern Coding in Deep Spiking Neural Networks
    Rueckauer, Bodo
    Liu, Shih-Chii
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [7] Training Spiking Neural Networks with Local Tandem Learning
    Yang, Qu
    Wu, Jibin
    Zhang, Malu
    Chua, Yansong
    Wang, Xinchao
    Li, Haizhou
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [8] DSNNs: learning transfer from deep neural networks to spiking neural networks
    Zhang L.
    Du Z.
    Li L.
    Chen Y.
    High Technology Letters, 2020, 26(02): 136-144
  • [10] A HYBRID LEARNING FRAMEWORK FOR DEEP SPIKING NEURAL NETWORKS WITH ONE-SPIKE TEMPORAL CODING
    Wang, Jiadong
    Wu, Jibin
    Zhang, Malu
    Liu, Qi
    Li, Haizhou
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022: 8942-8946