Effective Active Learning Method for Spiking Neural Networks

Citations: 1
Authors
Xie, Xiurui [1 ]
Yu, Bei [1 ]
Liu, Guisong [2 ,3 ]
Zhan, Qiugang [1 ]
Tang, Huajin [4 ,5 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Chengdu 611731, Peoples R China
[2] Southwestern Univ Finance & Econ, Sch Comp & Artificial Intelligence, Chengdu 611130, Peoples R China
[3] Univ Elect Sci & Technol China, Zhongshan Inst, Zhongshan 528400, Peoples R China
[4] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou 310027, Peoples R China
[5] Zhejiang Lab, Hangzhou 311122, Peoples R China
Keywords
Biological system modeling; Neurons; Learning systems; Predictive models; Training; Task analysis; Integrated circuit modeling; Active learning method; deep learning; feature representation; spiking neural network (SNN)
DOI
10.1109/TNNLS.2023.3257333
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A large quantity of labeled data is required to train high-performance deep spiking neural networks (SNNs), but obtaining labeled data is expensive. Active learning aims to reduce the quantity of labeled data required by deep learning models. However, conventional active learning methods are not as effective in SNNs as they are in conventional artificial neural networks (ANNs) because of differences in feature representation and information transmission. To address this issue, we propose an effective active learning method for deep SNN models in this article. Specifically, a loss prediction module, ActiveLossNet, is proposed to extract features and select valuable samples for deep SNNs. We then derive the corresponding active learning algorithm for deep SNN models. Comprehensive experiments are conducted on CIFAR-10, MNIST, Fashion-MNIST, and SVHN with different SNN frameworks, including the seven-layer CIFARNet and the 20-layer ResNet-18. The comparison results demonstrate that the proposed active learning algorithm outperforms random selection and conventional ANN active learning methods. In addition, our method converges faster than conventional active learning methods.
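The core idea of loss-prediction-based active learning, as described in the abstract, is to rank unlabeled samples by their estimated loss and label the hardest ones first. The sketch below is a generic illustration of that selection step only; it is not the authors' ActiveLossNet, and the function name and toy loss values are assumptions for demonstration.

```python
import numpy as np

def select_by_predicted_loss(predicted_losses, k):
    """Return indices of the k unlabeled samples with the highest
    predicted loss -- the samples the model is presumed to learn
    the most from if an oracle labels them."""
    predicted_losses = np.asarray(predicted_losses, dtype=float)
    k = min(k, predicted_losses.size)
    # argpartition finds the k largest entries in O(n) without a full sort;
    # a final sort over just those k gives a descending-loss ordering.
    top_k = np.argpartition(-predicted_losses, k - 1)[:k]
    return top_k[np.argsort(-predicted_losses[top_k])]

# Toy pool of six unlabeled samples with loss estimates from a predictor:
pool_losses = [0.12, 0.95, 0.40, 0.77, 0.05, 0.63]
print(select_by_predicted_loss(pool_losses, 3))  # indices of the 3 hardest samples
```

In a full active learning loop, this selection would alternate with retraining: the chosen samples are labeled, moved from the pool to the training set, and the loss predictor is updated before the next round.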
Pages: 12373-12382
Page count: 10