A Memcapacitive Spiking Neural Network with Circuit Nonlinearity-aware Training

Cited by: 5
Authors
Oshio, Reon [1]
Sugahara, Takuya [1]
Sawada, Atsushi [1]
Kimura, Mutsumi [1,2]
Zhang, Renyuan [1]
Nakashima, Yasuhiko [1]
Affiliations
[1] Nara Inst Sci & Technol NAIST, Grad Sch Sci & Technol, Ikoma, Nara, Japan
[2] Ryukoku Univ, Grad Sch Sci & Technol, Otsu, Shiga, Japan
Source
IEEE SYMPOSIUM ON LOW-POWER AND HIGH-SPEED CHIPS AND SYSTEMS (2022 IEEE COOL CHIPS 25) | 2022
Keywords
Neuromorphic Computing; Spiking Neural Network (SNN); Processing-In-Memory (PIM); Analog Neuron Circuit; Memcapacitor; Hardware/Software Co-design;
DOI
10.1109/COOLCHIPS54332.2022.9772674
Chinese Library Classification
TP3 (Computing technology, computer technology)
Discipline Classification Code
0812
Abstract
Neuromorphic computing is an unconventional computing scheme that executes algorithms on dedicated hardware using Spiking Neural Networks (SNNs), which mimic neural dynamics, at high speed and with low power consumption. Analog implementations of neuromorphic computing have been studied for edge computing and related applications, and are considered superior to digital implementations in terms of power consumption. Furthermore, Processing-In-Memory (PIM) based synaptic operation, in which non-volatile memory (NVM) devices serve both as weight storage and as multiply-accumulate units, is expected to achieve extremely low power consumption. However, unintended non-linearities and hysteresis arise when analog spiking neuron circuits are implemented as simply as possible. As a result, accuracy is thought to degrade when inference is performed by mapping the weight parameters of SNNs trained offline onto the element parameters of the NVM. In this study, we design neuromorphic hardware operating at 100 MHz that employs memcapacitors as synaptic elements, which is expected to yield ultra-low power consumption. We also propose a training method for SNNs that incorporates the nonlinearity of the designed circuit into the neuron model and converts the trained synaptic weights into circuit element parameters. The proposed training method reduces the accuracy degradation even for very simple neuron circuits. The proposed circuit and method classify MNIST at approximately 97% accuracy with approximately 33.88 nJ/inference, excluding the encoder. The circuit design and the measurement of circuit characteristics were performed in a Rohm 180 nm process using HSPICE. A spiking neuron model that incorporates the circuit non-linearity as an activation function was implemented in PyTorch, a machine learning framework for Python.
Pages: 6
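Illustrative sketch. The abstract states that a spiking neuron model incorporating the circuit non-linearity as an activation function was implemented in PyTorch, with the learned weights later converted to circuit element parameters. The code below is a minimal sketch of what such a nonlinearity-aware neuron could look like; it is not the authors' implementation. The class names, the polynomial stand-in for the measured circuit transfer curve, the surrogate-gradient width, and all numeric defaults are assumptions made for illustration.

# Minimal sketch (not the authors' code): a LIF-style spiking neuron whose
# synaptic drive is warped by a fitted circuit nonlinearity before integration,
# trained with a surrogate gradient. The polynomial coefficients are placeholders
# for a fit to measured HSPICE characteristics.
import torch
import torch.nn as nn

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike with a rectangular surrogate gradient."""
    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Pass gradients only near the threshold (width 0.5 is an assumption).
        return grad_output * (v.abs() < 0.5).float()

class CircuitAwareLIF(nn.Module):
    """LIF neuron whose input current passes through a circuit nonlinearity f(x).

    `poly_coeffs` stand in for coefficients fitted to the measured transfer
    characteristic of the memcapacitive synapse/neuron circuit (hypothetical).
    """
    def __init__(self, in_features, out_features,
                 tau=0.9, v_thresh=1.0,
                 poly_coeffs=(0.0, 1.0, -0.1)):  # hypothetical fit: f(x) = x - 0.1*x^2
        super().__init__()
        self.fc = nn.Linear(in_features, out_features, bias=False)
        self.tau, self.v_thresh = tau, v_thresh
        self.register_buffer("coeffs", torch.tensor(poly_coeffs))

    def circuit_nonlinearity(self, x):
        # Polynomial stand-in for the measured circuit transfer curve.
        return sum(c * x ** i for i, c in enumerate(self.coeffs))

    def forward(self, spikes_in):          # spikes_in: (time, batch, in_features)
        v = torch.zeros(spikes_in.shape[1], self.fc.out_features,
                        device=spikes_in.device)
        out = []
        for x_t in spikes_in:
            drive = self.circuit_nonlinearity(self.fc(x_t))
            v = self.tau * v + drive       # leaky integration of the warped drive
            s = SurrogateSpike.apply(v - self.v_thresh)
            v = v * (1.0 - s)              # reset membrane potential after a spike
            out.append(s)
        return torch.stack(out)

A network stacked from such layers can be trained with ordinary PyTorch optimizers. In the flow the abstract describes, the fitted nonlinearity would come from the HSPICE measurements, and the learned weights would subsequently be converted into memcapacitor element parameters.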