STSF: Spiking Time Sparse Feedback Learning for Spiking Neural Networks

Cited: 0
Authors
He, Ping [1 ,2 ]
Xiao, Rong [1 ,2 ]
Tang, Chenwei [1 ,2 ]
Huang, Shudong [1 ,2 ]
Lv, Jiancheng [1 ,2 ]
Tang, Huajin [3 ]
Affiliations
[1] Sichuan Univ, Coll Comp Sci, Chengdu 610065, Peoples R China
[2] Minist Educ, Engn Res Ctr Machine Learning & Ind Intelligence, Chengdu 610065, Peoples R China
[3] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou 310027, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Global-local spiking learning; sparse direct feedback alignment (DFA); spiking neural networks (SNNs); vanilla spike-timing-dependent plasticity (STDP); OPTIMIZATION; PLASTICITY; NEURONS;
DOI
10.1109/TNNLS.2025.3527700
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Spiking neural networks (SNNs) are biologically plausible models known for their computational efficiency. A significant advantage of SNNs lies in the binary information transmission through spike trains, which eliminates the need for multiplication operations. However, due to the spatio-temporal nature of SNNs, directly applying traditional backpropagation (BP) training still incurs significant computational costs. Meanwhile, learning methods based on unsupervised synaptic plasticity provide an alternative for training SNNs but often yield suboptimal results. Thus, efficiently training high-accuracy SNNs remains a challenge. In this article, we propose a highly efficient and biologically plausible spiking time sparse feedback (STSF) learning method. The algorithm modifies synaptic weights by combining a neuromodulator for global supervised learning via sparse direct feedback alignment (DFA) with local homeostatic learning via vanilla spike-timing-dependent plasticity (STDP). This neuromorphic global-local learning operates on instantaneous synaptic activity, enabling each network layer to be optimized independently and simultaneously, thereby improving biological plausibility, enhancing parallelism, and reducing storage overhead. Incorporating sparse fixed random feedback connections for global error modulation, which replace multiplication operations with selection operations, further improves computational efficiency. Experimental results demonstrate that the proposed algorithm markedly reduces computational cost while achieving accuracy comparable to current state-of-the-art algorithms across a wide range of classification tasks. Our implementation codes are available at https://github.com/hppeace/STSF.
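The abstract's core mechanics can be illustrated with a minimal sketch. The following is not the authors' implementation (see the linked repository for that); it is a hedged toy example, with illustrative names and parameters, of two ideas the abstract describes: (1) a fixed, sparse, ±1-valued random feedback matrix, so projecting the global output error onto a hidden layer reduces to selecting and adding/subtracting error entries rather than multiplying; and (2) a vanilla pairwise STDP rule serving as the local plasticity term.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes; purely illustrative, not from the paper.
n_out, n_hidden = 10, 100
sparsity = 0.9  # fraction of feedback connections pruned

# Fixed random sparse feedback: entries are -1, 0, or +1, so projecting the
# global error needs only sign selection and addition, not multiplication.
B = rng.choice([-1.0, 1.0], size=(n_hidden, n_out))
B *= rng.random((n_hidden, n_out)) > sparsity  # keep ~10% of connections

def sparse_dfa_feedback(error):
    """Project the output error to a hidden layer through the fixed sparse
    +/-1 feedback matrix; with such entries this is select-and-accumulate."""
    return B @ error

def vanilla_stdp(pre_t, post_t, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Weight change for one pre/post spike-time pair under pairwise STDP."""
    dt = post_t - pre_t
    if dt >= 0:   # pre fires before post -> potentiation
        return a_plus * np.exp(-dt / tau)
    else:         # post fires before pre -> depression
        return -a_minus * np.exp(dt / tau)

error = rng.standard_normal(n_out)            # global supervised error signal
local_mod = sparse_dfa_feedback(error)        # per-neuron modulatory signal
dw_local = vanilla_stdp(pre_t=5.0, post_t=12.0)  # local homeostatic term
```

How the two signals are combined into a single weight update (and how spike timing enters the global term) is specific to STSF and deliberately omitted here.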
Pages: 14
Related Papers
(50 total)
  • [21] A Survey on Spiking Neural Networks
    Han, Chan Sik
    Lee, Keon Myung
    INTERNATIONAL JOURNAL OF FUZZY LOGIC AND INTELLIGENT SYSTEMS, 2021, 21 (04) : 317 - 337
  • [22] Learning by Stimulation Avoidance as a Primary Principle of Spiking Neural Networks Dynamics
    Sinapayen, Lana
    Masumori, Atsushi
    Virgo, Nathaniel
    Ikegami, Takashi
    ECAL 2015: THE THIRTEENTH EUROPEAN CONFERENCE ON ARTIFICIAL LIFE, 2015, : 175 - 182
  • [23] Spiking Neural Networks: A Survey
    Nunes, Joao D.
    Carvalho, Marcelo
    Carneiro, Diogo
    Cardoso, Jaime S.
    IEEE ACCESS, 2022, 10 : 60738 - 60764
  • [24] Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State
    Xiao, Mingqing
    Meng, Qingyan
    Zhang, Zongpeng
    Wang, Yisen
    Lin, Zhouchen
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021,
  • [25] Spiking neural networks for autonomous driving: A review
    Martinez, Fernando S.
    Casas-Roma, Jordi
    Subirats, Laia
    Parada, Raul
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 138
  • [26] Mapping Spiking Neural Networks to Neuromorphic Hardware
    Balaji, Adarsha
    Das, Anup
    Wu, Yuefeng
    Huynh, Khanh
    Dell'Anna, Francesco G.
    Indiveri, Giacomo
    Krichmar, Jeffrey L.
    Dutt, Nikil D.
    Schaafsma, Siebren
    Catthoor, Francky
    IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS, 2020, 28 (01) : 76 - 86
  • [27] Optimal Heterogeneity for Coding in Spiking Neural Networks
    Mejias, J. F.
    Longtin, A.
    PHYSICAL REVIEW LETTERS, 2012, 108 (21)
  • [28] Plasticity in memristive devices for spiking neural networks
    Saighi, Sylvain
    Mayr, Christian G.
    Serrano-Gotarredona, Teresa
    Schmidt, Heidemarie
    Lecerf, Gwendal
    Tomas, Jean
    Grollier, Julie
    Boyn, Soeren
    Vincent, Adrien F.
    Querlioz, Damien
    La Barbera, Selina
    Alibart, Fabien
    Vuillaume, Dominique
    Bichler, Olivier
    Gamrat, Christian
    Linares-Barranco, Bernabe
    FRONTIERS IN NEUROSCIENCE, 2015, 9
  • [29] Smooth Exact Gradient Descent Learning in Spiking Neural Networks
    Klos, Christian
    Memmesheimer, Raoul-Martin
    PHYSICAL REVIEW LETTERS, 2025, 134 (02)
  • [30] Advancing Spiking Neural Networks Toward Deep Residual Learning
    Hu, Yifan
    Deng, Lei
    Wu, Yujie
    Yao, Man
    Li, Guoqi
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36 (02) : 2353 - 2367