Adaptive Feature Self-Attention in Spiking Neural Networks for Hyperspectral Classification

Cited by: 1
Authors
Li, Heng [1]
Tu, Bing [1]
Liu, Bo [1]
Li, Jun [2]
Plaza, Antonio [3]
Affiliations
[1] Nanjing Univ Informat Sci & Technol, Inst Opt & Elect, Jiangsu Engn Res Ctr Intelligent Optoelect Sensing, Sch Phys & Optoelect Engn,State Key Lab Cultivat B, Nanjing 210044, Jiangsu, Peoples R China
[2] China Univ Geosci, Fac Comp Sci, Wuhan 430074, Peoples R China
[3] Univ Extremadura, Escuela Politecn, Dept Technol Comp & Commun, Hyperspectral Comp Lab, Caceres 10003, Spain
Source
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING | 2025, Vol. 63
Funding
National Natural Science Foundation of China;
Keywords
Hyperspectral imaging; Accuracy; Feature extraction; Convolution; Computational modeling; Spiking neural networks; Training; Image classification; Energy consumption; Adaptation models; Convolutional neural networks (CNNs); hyperspectral image (HSI) classification; spike self-attention (SSA); spiking neural networks (SNNs); IMAGES;
DOI
10.1109/TGRS.2024.3516742
Chinese Library Classification (CLC)
P3 [Geophysics]; P59 [Geochemistry];
Discipline Codes
0708; 070902;
Abstract
Hyperspectral image (HSI) classification is crucial for remote sensing research, but its high-dimensional features are difficult for traditional algorithms to handle. Despite breakthroughs in deep learning, high computational complexity and energy consumption limit its application in resource-constrained environments. Spiking neural networks (SNNs), which mimic the brain's information processing at low power, have emerged as a promising alternative for edge computing. However, SNNs struggle with complex tasks because spike signals are nondifferentiable, which complicates training and limits their ability to extract deep features and model long-range dependencies. In this article, we propose a novel SNN framework that addresses these challenges by enhancing feature extraction and efficiently capturing dependencies in hyperspectral data. Our framework integrates an adaptive refocusing convolutional layer with a spike self-attention (SSA) mechanism. The adaptive refocusing convolutional layer employs learnable parameters to dynamically adjust the convolutional kernel's response to input spike data, improving feature representation. Experimental results show that the model achieves over 96% classification accuracy in a single time step, significantly surpassing current methods and effectively solving the problem of low accuracy at short time steps in SNNs. In addition, the framework reduces computational energy consumption by approximately 12.5x compared with comparable models, offering new potential for edge intelligence applications.
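To make the abstract's two components concrete, the following is a minimal PyTorch sketch of how an adaptive refocusing convolution and a spike self-attention (SSA) block could fit together. Everything here is an assumption for illustration: the class names AdaptiveRefocusConv and SpikeSelfAttention, the per-channel scale/shift "refocusing" parameters, and the surrogate-gradient spike function are inferred from the abstract and from common softmax-free SSA designs, not taken from the authors' implementation.

import torch
import torch.nn as nn

class AdaptiveRefocusConv(nn.Module):
    # Hypothetical adaptive refocusing convolution: learnable per-channel
    # scale (gamma) and shift (beta) rescale the kernel's response to
    # binary spike inputs. This is a guess at the mechanism the abstract
    # describes, not the paper's exact layer.
    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, k, padding=k // 2, bias=False)
        self.gamma = nn.Parameter(torch.ones(out_ch, 1, 1))
        self.beta = nn.Parameter(torch.zeros(out_ch, 1, 1))
        self.bn = nn.BatchNorm2d(out_ch)

    def forward(self, spikes):  # spikes: (B, C, H, W), values in {0, 1}
        return self.bn(self.gamma * self.conv(spikes) + self.beta)

class SpikeSelfAttention(nn.Module):
    # Sketch of spike self-attention: Q, K, V are binarized spike maps,
    # so attention scores come from spike products and no softmax is
    # needed (as in Spikformer-style SSA; details here are assumed).
    def __init__(self, dim, heads=4):
        super().__init__()
        self.heads, self.scale = heads, (dim // heads) ** -0.5
        self.q = nn.Linear(dim, dim, bias=False)
        self.k = nn.Linear(dim, dim, bias=False)
        self.v = nn.Linear(dim, dim, bias=False)
        self.proj = nn.Linear(dim, dim, bias=False)

    @staticmethod
    def spike(x):
        # Heaviside firing with a straight-through sigmoid surrogate
        # gradient, so the nondifferentiable spike remains trainable.
        return (x > 0).float() + torch.sigmoid(x) - torch.sigmoid(x).detach()

    def forward(self, x):  # x: (B, N, dim) token features
        B, N, D = x.shape
        h, d = self.heads, D // self.heads
        q, k, v = (self.spike(f(x)).view(B, N, h, d).transpose(1, 2)
                   for f in (self.q, self.k, self.v))
        attn = (q @ k.transpose(-2, -1)) * self.scale   # (B, h, N, N)
        out = (attn @ v).transpose(1, 2).reshape(B, N, D)
        return self.proj(out)

# Usage sketch: one time step over a fake 30-band, 9x9 spike patch.
x = torch.rand(2, 30, 9, 9).gt(0.5).float()     # binary spike input
feat = AdaptiveRefocusConv(30, 64)(x)           # (2, 64, 9, 9)
tokens = feat.flatten(2).transpose(1, 2)        # (2, 81, 64) pixel tokens
print(SpikeSelfAttention(64)(tokens).shape)     # torch.Size([2, 81, 64])

Because Q, K, and V are binary spikes, the attention products reduce to sparse accumulate operations rather than dense multiply-accumulates, which is one plausible source of the energy savings the abstract reports.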
Pages: 15
Related Papers
(50 records in total; 10 shown)
  • [1] Ultralightweight Feature-Compressed Multihead Self-Attention Learning Networks for Hyperspectral Image Classification
    Li, Xinhao
    Xu, Mingming
    Liu, Shanwei
    Sheng, Hui
    Wan, Jianhua
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62: 1-14
  • [2] Quantum self-attention neural networks for text classification
    Li, Guangxi
    Zhao, Xuanqiang
    Wang, Xin
    SCIENCE CHINA-INFORMATION SCIENCES, 2024, 67 (04)
  • [3] Spatial-Temporal Self-Attention for Asynchronous Spiking Neural Networks
    Wang, Yuchen
    Shi, Kexin
    Lu, Chengzhuo
    Liu, Yuguo
    Zhang, Malu
    Qu, Hong
    PROCEEDINGS OF THE THIRTY-SECOND INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2023, 2023: 3085-3093
  • [4] Spectral-Spatial Self-Attention Networks for Hyperspectral Image Classification
    Zhang, Xuming
    Sun, Genyun
    Jia, Xiuping
    Wu, Lixin
    Zhang, Aizhu
    Ren, Jinchang
    Fu, Hang
    Yao, Yanjuan
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60
  • [5] Self-Attention Enhanced Recurrent Neural Networks for Sentence Classification
    Kumar, Ankit
    Rastogi, Reshma
    2018 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI), 2018: 905-911
  • [6] Spiking neural self-attention network for sequence recommendation
    Bai, Xinzhu
    Huang, Yanping
    Peng, Hong
    Yang, Qian
    Wang, Jun
    Liu, Zhicai
    APPLIED SOFT COMPUTING, 2025, 169
  • [7] Lightweight Self-Attention Residual Network for Hyperspectral Classification
    Xia, Jinbiao
    Cui, Ying
    Li, Wenshan
    Wang, Liguo
    Wang, Chao
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2022, 19
  • [8] Feature Importance Estimation with Self-Attention Networks
    Skrlj, Blaz
    Dzeroski, Saso
    Lavrac, Nada
    Petkovic, Matej
    ECAI 2020: 24TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, 325: 1491-1498
  • [9] SAOCNN: Self-Attention and One-Class Neural Networks for Hyperspectral Anomaly Detection
    Wang, Jinshen
    Ouyang, Tongbin
    Duan, Yuxiao
    Cui, Linyan
    REMOTE SENSING, 2022, 14 (21)