A Spatial-Channel-Temporal-Fused Attention for Spiking Neural Networks

Cited by: 5
Authors
Cai, Wuque [1 ]
Sun, Hongze [1 ]
Liu, Rui [1 ]
Cui, Yan [2 ]
Wang, Jun [1 ]
Xia, Yang [1 ]
Yao, Dezhong [1 ,3 ,4 ]
Guo, Daqing [1 ]
Affiliations
[1] Univ Elect Sci & Technol China, Clin Hosp Chengdu Brain Sci Inst, Sch Life Sci & Technol, Minist Educ MOE Key Lab NeuroInformat, Chengdu 611731, Peoples R China
[2] Univ Elect Sci & Technol China, Sichuan Prov Peoples Hosp, Dept Neurosurg, Chengdu 610072, Peoples R China
[3] Chinese Acad Med Sci, Res Unit NeuroInformat 2019RU035, Chengdu 611731, Peoples R China
[4] Zhengzhou Univ, Sch Elect Engn, Zhengzhou 450001, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Neurons; Visualization; Biological system modeling; Training; Computational modeling; Membrane potentials; Biological neural networks; Event streams; predictive attentional remapping; spatial-channel-temporal-fused attention (SCTFA); spiking neural networks (SNNs); visual attention
DOI
10.1109/TNNLS.2023.3278265
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Spiking neural networks (SNNs) mimic brain computational strategies and exhibit substantial capabilities in spatiotemporal information processing. As an essential factor in human perception, visual attention refers to the dynamic process of selecting salient regions in biological vision systems. Although visual attention mechanisms have achieved great success in computer vision applications, they are rarely introduced into SNNs. In the present study, inspired by experimental observations on predictive attentional remapping, we propose a new spatial-channel-temporal-fused attention (SCTFA) module that guides SNNs to efficiently capture underlying target regions by utilizing accumulated historical spatial-channel information. Through a systematic evaluation on three event stream datasets (DVS Gesture, SL-Animals-DVS, and MNIST-DVS), we demonstrate that the SNN with the SCTFA module (SCTFA-SNN) not only significantly outperforms the baseline SNN (BL-SNN) and two other SNN models with degenerated attention modules, but also achieves accuracy competitive with existing state-of-the-art (SOTA) methods. Additionally, our detailed analysis shows that the proposed SCTFA-SNN model has strong robustness to noise and outstanding stability when faced with incomplete data, while maintaining acceptable complexity and efficiency. Overall, these findings indicate that incorporating appropriate cognitive mechanisms of the brain may provide a promising approach to elevate the capabilities of SNNs.
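For readers who want a concrete picture of how channel, spatial, and temporally accumulated attention can be fused over spiking feature maps, the PyTorch-style sketch below operates on a [T, B, C, H, W] tensor. This is an illustrative approximation only, not the authors' implementation: the class name SCTFASketch, the leaky accumulation of historical features with a fixed decay, and all hyperparameters are assumptions made for the example.

# Minimal sketch of a spatial-channel-temporal-fused attention block for an SNN.
# Hypothetical; the paper's exact formulation is not reproduced here.
import torch
import torch.nn as nn


class SCTFASketch(nn.Module):
    def __init__(self, channels: int, reduction: int = 8, decay: float = 0.5):
        super().__init__()
        self.decay = decay  # assumed weight for accumulating historical information
        # Channel branch: squeeze-and-excitation style gating.
        self.channel_fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )
        # Spatial branch: 2-D convolution over channel-pooled statistics.
        self.spatial_conv = nn.Sequential(
            nn.Conv2d(2, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [T, B, C, H, W] spiking feature maps over T timesteps.
        T, B, C, H, W = x.shape
        history = torch.zeros(B, C, H, W, device=x.device, dtype=x.dtype)
        outputs = []
        for t in range(T):
            # Accumulate historical spatial-channel information (leaky running sum).
            history = self.decay * history + x[t]
            # Channel attention from globally pooled historical features.
            ch_desc = history.mean(dim=(2, 3))                    # [B, C]
            ch_att = self.channel_fc(ch_desc).view(B, C, 1, 1)
            # Spatial attention from channel-pooled historical features.
            sp_desc = torch.cat(
                [history.mean(dim=1, keepdim=True),
                 history.amax(dim=1, keepdim=True)], dim=1)       # [B, 2, H, W]
            sp_att = self.spatial_conv(sp_desc)                   # [B, 1, H, W]
            # Fuse both attention maps and modulate the current timestep.
            outputs.append(x[t] * ch_att * sp_att)
        return torch.stack(outputs, dim=0)


if __name__ == "__main__":
    feats = torch.rand(10, 2, 64, 32, 32)   # [T, B, C, H, W]
    out = SCTFASketch(64)(feats)
    print(out.shape)                        # torch.Size([10, 2, 64, 32, 32])

In this sketch the attention statistics are computed from the accumulated history rather than from the current frame alone, which is one plausible way to realize the "accumulated historical spatial-channel information" described in the abstract.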
Pages: 14315-14329
Number of pages: 15
Related Papers
50 records in total
  • [1] Spatial-Temporal Self-Attention for Asynchronous Spiking Neural Networks
    Wang, Yuchen
    Shi, Kexin
    Lu, Chengzhuo
    Liu, Yuguo
    Zhang, Malu
    Qu, Hong
    PROCEEDINGS OF THE THIRTY-SECOND INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2023, 2023, : 3085 - 3093
  • [2] TCJA-SNN: Temporal-Channel Joint Attention for Spiking Neural Networks
    Zhu, Rui-Jie
    Zhang, Malu
    Zhao, Qihang
    Deng, Haoyu
    Duan, Yule
    Deng, Liang-Jian
arXiv, 2022
  • [3] TCJA-SNN: Temporal-Channel Joint Attention for Spiking Neural Networks
    Zhu, Rui-Jie
    Zhang, Malu
    Zhao, Qihang
    Deng, Haoyu
    Duan, Yule
    Deng, Liang-Jian
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, : 1 - 14
  • [4] STCA-SNN: self-attention-based temporal-channel joint attention for spiking neural networks
    Wu, Xiyan
    Song, Yong
    Zhou, Ya
    Jiang, Yurong
    Bai, Yashuo
    Li, Xinyi
    Yang, Xin
    FRONTIERS IN NEUROSCIENCE, 2023, 17
  • [5] Attention Spiking Neural Networks
    Yao, Man
    Zhao, Guangshe
    Zhang, Hengyu
    Hu, Yifan
    Deng, Lei
    Tian, Yonghong
    Xu, Bo
    Li, Guoqi
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (08) : 9393 - 9410
  • [6] Spatial Channel Attention for Deep Convolutional Neural Networks
    Liu, Tonglai
    Luo, Ronghai
    Xu, Longqin
    Feng, Dachun
    Cao, Liang
    Liu, Shuangyin
    Guo, Jianjun
    MATHEMATICS, 2022, 10 (10)
  • [7] Channel attention-based spatial-temporal graph neural networks for traffic prediction
    Wang, Bin
    Gao, Fanghong
    Tong, Le
    Zhang, Qian
    Zhu, Sulei
    DATA TECHNOLOGIES AND APPLICATIONS, 2023, 58 (01) : 81 - 94
  • [8] Temporal-wise Attention Spiking Neural Networks for Event Streams Classification
    Yao, Man
    Gao, Huanhuan
    Zhao, Guangshe
    Wang, Dingheng
    Lin, Yihan
    Yang, Zhaoxu
    Li, Guoqi
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 10201 - 10210
  • [9] Spike Attention Coding for Spiking Neural Networks
    Liu, Jiawen
    Hu, Yifan
    Li, Guoqi
    Pei, Jing
    Deng, Lei
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (12) : 18892 - 18898
  • [10] Attention-Based Deep Spiking Neural Networks for Temporal Credit Assignment Problems
    Qin, Lang
    Wang, Ziming
    Yan, Rui
    Tang, Huajin
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (08) : 10301 - 10311