Global-Local Convolution with Spiking Neural Networks for Energy-efficient Keyword Spotting

Citations: 0
Authors
Wang, Shuai [1 ]
Zhang, Dehao [1 ]
Shi, Kexin [1 ]
Wang, Yuchen [1 ]
Wei, Wenjie [1 ]
Wu, Jibin [2 ]
Zhang, Malu [1 ]
Affiliations
[1] Univ Elect Sci & Technol China, Chengdu, Sichuan, Peoples R China
[2] Hong Kong Polytech Univ, Hong Kong, Peoples R China
Source
INTERSPEECH 2024, 2024
Funding
US National Science Foundation
Keywords
Keyword spotting; Spiking neural networks; Global-Local spiking convolution;
DOI
10.21437/Interspeech.2024-642
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Thanks to Deep Neural Networks (DNNs), the accuracy of Keyword Spotting (KWS) has improved substantially. However, since KWS systems are usually deployed on edge devices, energy efficiency becomes a critical requirement alongside performance. Here, we exploit the energy efficiency of spiking neural networks and propose an end-to-end lightweight KWS model. The model consists of two innovative modules: 1) a Global-Local Spiking Convolution (GLSC) module and 2) a Bottleneck-PLIF module. Compared with hand-crafted feature extraction methods, the GLSC module achieves speech feature extraction that is sparser and more energy-efficient while yielding better performance. The Bottleneck-PLIF module further processes the signals from GLSC, aiming to achieve higher accuracy with fewer parameters. Extensive experiments are conducted on the Google Speech Commands Dataset (V1 and V2). The results show that our method achieves competitive performance among SNN-based KWS models with fewer parameters.
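The Bottleneck-PLIF module builds on the Parametric Leaky Integrate-and-Fire (PLIF) neuron, whose membrane dynamics can be sketched as follows. This is a minimal illustration under standard LIF assumptions, not the authors' implementation: in a PLIF neuron the time constant `tau` is a learnable parameter, while here it is fixed, and the threshold and reset values are chosen for the example.

```python
def plif_neuron(inputs, tau=2.0, v_threshold=1.0, v_reset=0.0):
    """Simulate one PLIF-style neuron over a sequence of input currents.

    Membrane update per time step:
        v[t] = v[t-1] + (x[t] - (v[t-1] - v_reset)) / tau
    A spike is emitted when v crosses v_threshold, followed by a hard reset.
    """
    v = v_reset
    spikes = []
    for x in inputs:
        # Leaky integration: potential decays toward v_reset
        # while accumulating the input current x.
        v = v + (x - (v - v_reset)) / tau
        if v >= v_threshold:
            spikes.append(1)
            v = v_reset  # hard reset after a spike
        else:
            spikes.append(0)
    return spikes

# A constant supra-threshold input makes the neuron fire periodically.
print(plif_neuron([1.5, 1.5, 1.5, 1.5]))  # → [0, 1, 0, 1]
```

During training, the hard threshold is typically replaced by a surrogate gradient so that `tau` and the synaptic weights can be learned end-to-end; the inference-time behavior remains the simple accumulate-and-fire loop above.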
Pages: 4523-4527 (5 pages)
Related Papers
50 records in total
  • [1] Deep Convolutional Spiking Neural Networks for Keyword Spotting
    Yilmaz, Emre
    Gevrek, Ozgur Bora
    Wu, Jibin
    Chen, Yuxiang
    Meng, Xuanbo
    Li, Haizhou
    INTERSPEECH 2020, 2020, : 2557 - 2561
  • [2] Small-footprint Spiking Neural Networks for Power-efficient Keyword Spotting
    Pedroni, Bruno U.
    Sheik, Sadique
    Mostafa, Hesham
    Paul, Somnath
    Augustine, Charles
    Cauwenberghs, Gert
    2018 IEEE BIOMEDICAL CIRCUITS AND SYSTEMS CONFERENCE (BIOCAS): ADVANCED SYSTEMS FOR ENHANCING HUMAN HEALTH, 2018, : 591 - 594
  • [3] Neural Dynamics Pruning for Energy-Efficient Spiking Neural Networks
    Huang, Haoyu
    He, Linxuan
    Liu, Faqiang
    Zhao, Rong
    Shi, Luping
    2024 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME 2024, 2024,
  • [4] BitSNNs: Revisiting Energy-Efficient Spiking Neural Networks
    Hu, Yangfan
    Zheng, Qian
    Pan, Gang
    IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2024, 16 (05) : 1736 - 1747
  • [5] AutoSNN: Towards Energy-Efficient Spiking Neural Networks
    Na, Byunggook
    Mok, Jisoo
    Park, Seongsik
    Lee, Dongjin
    Choe, Hyeokjun
    Yoon, Sungroh
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [6] Dynamic Spike Bundling for Energy-Efficient Spiking Neural Networks
    Krithivasan, Sarada
    Sen, Sanchari
    Venkataramani, Swagath
    Raghunathan, Anand
    2019 IEEE/ACM INTERNATIONAL SYMPOSIUM ON LOW POWER ELECTRONICS AND DESIGN (ISLPED), 2019,
  • [7] Towards Energy-Efficient Sentiment Classification with Spiking Neural Networks
    Chen, Junhao
    Ye, Xiaojun
    Sun, Jingbo
    Li, Chao
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PART X, 2023, 14263 : 518 - 529
  • [8] Efficient keyword spotting using time delay neural networks
    Myer, Samuel
    Tomar, Vikrant Singh
    19TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2018), VOLS 1-6: SPEECH RESEARCH FOR EMERGING MARKETS IN MULTILINGUAL SOCIETIES, 2018, : 1264 - 1268
  • [9] Conversion of Siamese networks to spiking neural networks for energy-efficient object tracking
    Luo, Yihao
    Shen, Haibo
    Cao, Xiang
    Wang, Tianjiang
    Feng, Qi
    Tan, Zehan
NEURAL COMPUTING & APPLICATIONS, 2022, 34 (12) : 9967 - 9982