Effective Active Learning Method for Spiking Neural Networks

Cited: 1
Authors
Xie, Xiurui [1 ]
Yu, Bei [1 ]
Liu, Guisong [2 ,3 ]
Zhan, Qiugang [1 ]
Tang, Huajin [4 ,5 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Chengdu 611731, Peoples R China
[2] Southwestern Univ Finance & Econ, Sch Comp & Artificial Intelligence, Chengdu 611130, Peoples R China
[3] Univ Elect Sci & Technol China, Zhongshan Inst, Zhongshan 528400, Peoples R China
[4] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou 310027, Peoples R China
[5] Zhejiang Lab, Hangzhou 311122, Peoples R China
Keywords
Biological system modeling; Neurons; Learning systems; Predictive models; Training; Task analysis; Integrated circuit modeling; Active learning method; deep learning; feature representation; spiking neural network (SNN);
DOI
10.1109/TNNLS.2023.3257333
CLC classification number
TP18 [Theory of artificial intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A large quantity of labeled data is required to train high-performance deep spiking neural networks (SNNs), but obtaining labeled data is expensive. Active learning has been proposed to reduce the quantity of labeled data required by deep learning models. However, conventional active learning methods applied to SNNs are not as effective as those in conventional artificial neural networks (ANNs) because of differences in feature representation and information transmission. To address this issue, in this article we propose an effective active learning method for deep SNN models. Specifically, a loss prediction module, ActiveLossNet, is proposed to extract features and select valuable samples for deep SNNs. We then derive the corresponding active learning algorithm for deep SNN models. Comprehensive experiments are conducted on CIFAR-10, MNIST, Fashion-MNIST, and SVHN with different SNN frameworks, including the seven-layer CIFARNet and the 20-layer ResNet-18. The comparison results demonstrate that the proposed active learning algorithm outperforms random selection and conventional ANN active learning methods. In addition, our method converges faster than conventional active learning methods.
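The abstract above describes ranking unlabeled samples by a predicted loss and labeling the most valuable ones. As a rough illustration of the selection step only, here is a minimal sketch; the function name and the "top-k by predicted loss" rule are assumptions for illustration, since ActiveLossNet's architecture and exact selection criterion are not given in this record.

```python
def select_by_predicted_loss(predicted_losses, k):
    """Loss-prediction-based active learning, selection step (sketch).

    Given one predicted loss per unlabeled sample (hypothetically produced
    by a module like ActiveLossNet), return the indices of the k samples
    with the highest predicted loss -- i.e., the samples the current model
    is expected to handle worst, which are sent for labeling.
    """
    # Sort indices by predicted loss, highest first, and keep the top k.
    ranked = sorted(range(len(predicted_losses)),
                    key=lambda i: predicted_losses[i],
                    reverse=True)
    return ranked[:k]

# Example: with predicted losses [0.1, 0.9, 0.4, 0.7] and a budget of 2,
# samples 1 and 3 are selected for annotation.
print(select_by_predicted_loss([0.1, 0.9, 0.4, 0.7], 2))  # → [1, 3]
```

In a full pipeline this selection would alternate with retraining: label the selected samples, add them to the training set, retrain the SNN and the loss predictor, and repeat until the labeling budget is exhausted.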
Pages: 12373-12382
Page count: 10