Quantized rewiring: hardware-aware training of sparse deep neural networks

Cited by: 1
Authors
Petschenig, Horst [1 ]
Legenstein, Robert [1 ]
Affiliations
[1] Graz Univ Technol, Inst Theoret Comp Sci, A-8010 Graz, Austria
Source
NEUROMORPHIC COMPUTING AND ENGINEERING | 2023, Vol. 3, No. 2
Funding
Austrian Science Fund;
Keywords
network rewiring; hardware-aware training; sparse networks; efficient networks; weight quantization; spiking neural networks; ON-CHIP; ARCHITECTURE; PLASTICITY; SYSTEM; LOIHI;
D O I
10.1088/2634-4386/accd8f
CLC classification
TM [Electrical engineering]; TN [Electronic and communication technology];
Subject classification codes
0808; 0809
Abstract
Mixed-signal and fully digital neuromorphic systems have been of significant interest for deploying spiking neural networks in an energy-efficient manner. However, many of these systems impose constraints in terms of fan-in, memory, or synaptic weight precision that have to be considered during network design and training. In this paper, we present quantized rewiring (Q-rewiring), an algorithm that can train both spiking and non-spiking neural networks while meeting hardware constraints during the entire training process. To demonstrate our approach, we train both feedforward and recurrent neural networks with a combined fan-in/weight precision limit, a constraint that is, for example, present in the DYNAP-SE mixed-signal analog/digital neuromorphic processor. Q-rewiring simultaneously performs quantization and rewiring of synapses and synaptic weights through gradient descent updates combined with projection of the trainable parameters onto a constraint-compliant region. Using our algorithm, we find trade-offs between the number of incoming connections to neurons and network performance for a number of common benchmark datasets.
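The projection step described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: it assumes, for the sake of example, that the hardware constraint is symmetric uniform weight quantization plus a per-neuron fan-in cap, and the function and parameter names (`project_to_constraints`, `num_levels`, `max_fan_in`) are invented for this example.

```python
import numpy as np

def project_to_constraints(W, num_levels=16, w_max=1.0, max_fan_in=8):
    """Project a weight matrix onto a hardware-feasible set (illustrative sketch).

    Assumed constraints (not necessarily the paper's exact ones):
      - symmetric uniform quantization with a zero level in [-w_max, w_max]
      - each neuron (column of W) keeps at most max_fan_in nonzero inputs
    """
    # Symmetric uniform quantization: integer levels in [-K, K], zero included.
    K = (num_levels - 1) // 2
    step = w_max / K
    Wq = np.clip(np.round(W / step), -K, K) * step

    # Enforce the fan-in limit: keep only the largest-magnitude incoming
    # weights per output neuron and rewire (zero out) the rest.
    for j in range(Wq.shape[1]):
        col = np.abs(Wq[:, j])
        if np.count_nonzero(col) > max_fan_in:
            keep = np.argsort(col)[-max_fan_in:]
            mask = np.zeros_like(col, dtype=bool)
            mask[keep] = True
            Wq[~mask, j] = 0.0
    return Wq

# One projected-gradient training step on dense trainable parameters:
rng = np.random.default_rng(0)
W = rng.normal(scale=0.5, size=(32, 16))
grad = rng.normal(scale=0.1, size=W.shape)  # placeholder gradient
W = project_to_constraints(W - 0.01 * grad)
```

Applying the projection after every gradient update keeps the network constraint-compliant throughout training, which is the key difference from post-hoc quantization or pruning.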
Pages: 15