A Memcapacitive Spiking Neural Network with Circuit Nonlinearity-aware Training

Cited by: 5
Authors
Oshio, Reon [1 ]
Sugahara, Takuya [1 ]
Sawada, Atsushi [1 ]
Kimura, Mutsumi [1 ,2 ]
Zhang, Renyuan [1 ]
Nakashima, Yasuhiko [1 ]
Affiliations
[1] Nara Inst Sci & Technol NAIST, Grad Sch Sci & Technol, Ikoma, Nara, Japan
[2] Ryukoku Univ, Grad Sch Sci & Technol, Otsu, Shiga, Japan
Source
IEEE SYMPOSIUM ON LOW-POWER AND HIGH-SPEED CHIPS AND SYSTEMS (2022 IEEE COOL CHIPS 25) | 2022
Keywords
Neuromorphic Computing; Spiking Neural Network (SNN); Processing-In-Memory (PIM); Analog Neuron Circuit; Memcapacitor; Hardware/Software Co-design;
DOI
10.1109/COOLCHIPS54332.2022.9772674
CLC number (Chinese Library Classification)
TP3 [Computing technology, computer technology];
Discipline code
0812;
Abstract
Neuromorphic computing is an unconventional computing scheme that executes algorithms using Spiking Neural Networks (SNNs), which mimic neural dynamics, at high speed and low power on dedicated hardware. Analog implementations of neuromorphic computing have been studied for edge computing and related applications and are considered superior to digital implementations in terms of power consumption. Furthermore, Processing-In-Memory (PIM) based synaptic operation, which uses non-volatile memory (NVM) devices both as weight storage and for multiply-accumulate operations, is expected to achieve extremely low power consumption. However, unintended nonlinearity and hysteresis arise when analog spiking neuron circuits are implemented as simply as possible, so accuracy is expected to degrade when inference is performed by mapping the weight parameters of an SNN trained offline onto the element parameters of the NVM devices. In this study, we design neuromorphic hardware operating at 100 MHz that employs memcapacitors as synaptic elements, which are expected to enable ultra-low power consumption. We also propose a training method that incorporates the nonlinearity of the designed circuit into the neuron model and converts the trained synaptic weights into circuit element parameters. The proposed training method can reduce the degradation of accuracy even for very simple neuron circuits. The proposed circuit and method classify MNIST at approximately 33.88 nJ/inference (excluding the encoder) with approximately 97% accuracy. The circuit was designed and its characteristics measured in a Rohm 180 nm process using HSPICE. A spiking neuron model that incorporates the circuit nonlinearity as an activation function was implemented in PyTorch, a machine learning framework for Python.
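The abstract states that the nonlinearity-aware neuron model was implemented in PyTorch but gives no code. The following is a minimal, hypothetical sketch of what such a model could look like: a leaky integrate-and-fire layer whose synaptic current passes through a stand-in circuit nonlinearity (a tanh placeholder for the measured memcapacitive response) and a surrogate-gradient spike function so that the distortion is visible to backpropagation. All names, the tanh curve, and the surrogate slope are illustrative assumptions, not the authors' implementation.

# A minimal sketch (not the authors' released code) of a spiking neuron
# model with a circuit nonlinearity folded into the forward pass.
# The function `circuit_nonlinearity` is a placeholder; in practice it
# would be fitted to HSPICE measurements of the designed neuron circuit.
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike with a sigmoid surrogate gradient for backprop."""

    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v_minus_thresh,) = ctx.saved_tensors
        sg = torch.sigmoid(4.0 * v_minus_thresh)   # assumed surrogate slope of 4
        return grad_output * 4.0 * sg * (1.0 - sg)


def circuit_nonlinearity(i_syn):
    # Hypothetical saturating transfer curve standing in for the measured
    # memcapacitive synapse/neuron response.
    return torch.tanh(i_syn)


class NonlinearityAwareLIF(nn.Module):
    """Leaky integrate-and-fire layer whose synaptic current is distorted by
    the (assumed) circuit nonlinearity before membrane integration."""

    def __init__(self, in_features, out_features, tau=0.9, v_th=1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features, bias=False)
        self.tau, self.v_th = tau, v_th

    def forward(self, spikes_in, v_mem):
        i_syn = circuit_nonlinearity(self.fc(spikes_in))  # nonlinear MAC
        v_mem = self.tau * v_mem + i_syn                  # leaky integration
        spikes_out = SurrogateSpike.apply(v_mem - self.v_th)
        v_mem = v_mem * (1.0 - spikes_out)                # reset on spike
        return spikes_out, v_mem


# Example step: one timestep of a 784-input, 10-output layer.
layer = NonlinearityAwareLIF(784, 10)
v = torch.zeros(1, 10)
spikes, v = layer(torch.rand(1, 784).bernoulli(), v)

After training, the paper's flow converts each learned weight into a circuit element parameter (a memcapacitance value); the abstract gives no conversion formula, so that step is omitted from the sketch.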
Pages: 6