Efficient Structure Slimming for Spiking Neural Networks

Cited by: 9
Authors
Li Y. [1 ]
Fang X. [1 ]
Gao Y. [1 ]
Zhou D. [1 ]
Shen J. [2 ]
Liu J.K. [3 ]
Pan G. [2 ]
Xu Q. [1 ]
Affiliations
[1] Dalian University of Technology, School of Computer Science and Technology, Dalian
[2] Zhejiang University, College of Computer Science and Technology, Zhejiang
[3] University of Birmingham, College of Engineering and Physical Sciences, Birmingham
Source
IEEE Transactions on Artificial Intelligence | 2024 / Vol. 5 / No. 8
Funding
National Natural Science Foundation of China
Keywords
Channel pruning; network slimming; spiking neural networks; structure learning; weight pruning
DOI
10.1109/TAI.2024.3352533
Abstract
Spiking neural networks (SNNs) are deeply inspired by biological neural information processing systems. Compared with convolutional neural networks (CNNs), SNNs consume little power because of their spike-based information processing mechanism. However, most current SNN structures are fully connected or converted from deep CNNs, which introduces redundant connections, whereas the structure and topology of human brain systems are sparse and efficient. This article aims to take full advantage of the sparse structure and low power consumption found in the human brain and proposes efficient structure slimming methods. Inspired by the development of biological neural network structures, it designs several structure slimming methods, including neuron pruning and channel pruning. Beyond pruning, it also accounts for the growth and development of the nervous system. Through iterative application of the proposed neural pruning and rewiring algorithms, experimental evaluations on the Canadian Institute for Advanced Research (CIFAR)-10, CIFAR-100, and dynamic vision sensor (DVS)-Gesture datasets demonstrate the effectiveness of the structure slimming methods: when the parameter count is reduced to only about 10% of the original, performance decreases by less than 1%. © 2020 IEEE.
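The record does not give the paper's actual slimming procedure, so as an illustrative sketch only, an iterative "prune then rewire" step on a binary connection mask might look like the following (the function name, parameters, and magnitude-based pruning criterion are assumptions, not the authors' method):

```python
import numpy as np

def prune_and_rewire(weights, mask, prune_frac=0.2, regrow_frac=0.1, rng=None):
    """One illustrative prune-and-rewire iteration (hypothetical sketch).

    Deactivates the smallest-magnitude active connections (pruning), then
    reactivates a random subset of inactive ones (rewiring/growth),
    loosely mimicking synaptic pruning and regrowth in the brain.
    Returns a new mask; `weights` and `mask` are not modified.
    """
    rng = np.random.default_rng(rng)
    active = np.flatnonzero(mask)        # indices of existing connections
    inactive = np.flatnonzero(mask == 0)  # indices of absent connections

    # Prune: drop the smallest-|w| fraction of active connections.
    n_prune = int(prune_frac * active.size)
    order = np.argsort(np.abs(weights.ravel()[active]))
    new_mask = mask.copy().ravel()
    new_mask[active[order[:n_prune]]] = 0

    # Rewire: regrow a random fraction of the inactive connections.
    n_grow = int(regrow_frac * inactive.size)
    if n_grow > 0:
        grown = rng.choice(inactive, size=n_grow, replace=False)
        new_mask[grown] = 1
    return new_mask.reshape(mask.shape)
```

Applying this step repeatedly, with retraining in between, would gradually sparsify a layer toward a target parameter budget such as the roughly 10% reported above.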
Pages: 3823-3831
Page count: 8