A Memcapacitive Spiking Neural Network with Circuit Nonlinearity-aware Training

Cited by: 5
Authors
Oshio, Reon [1 ]
Sugahara, Takuya [1 ]
Sawada, Atsushi [1 ]
Kimura, Mutsumi [1 ,2 ]
Zhang, Renyuan [1 ]
Nakashima, Yasuhiko [1 ]
Affiliations
[1] Nara Inst Sci & Technol NAIST, Grad Sch Sci & Technol, Ikoma, Nara, Japan
[2] Ryukoku Univ, Grad Sch Sci & Technol, Otsu, Shiga, Japan
Source
IEEE SYMPOSIUM ON LOW-POWER AND HIGH-SPEED CHIPS AND SYSTEMS (2022 IEEE COOL CHIPS 25) | 2022
Keywords
Neuromorphic Computing; Spiking Neural Network (SNN); Processing-In-Memory (PIM); Analog Neuron Circuit; Memcapacitor; Hardware/Software Co-design;
DOI
10.1109/COOLCHIPS54332.2022.9772674
CLC Classification Number
TP3 [Computing technology, computer technology]
Discipline Classification Code
0812
Abstract
Neuromorphic computing is an unconventional computing scheme in which dedicated hardware executes algorithms as Spiking Neural Networks (SNNs) that mimic neural dynamics, offering high speed and low power consumption. Analog implementations of neuromorphic computing have been studied in fields such as edge computing and are considered superior to digital implementations in terms of power consumption. Furthermore, Processing-In-Memory (PIM) synaptic operation, which uses non-volatile memory (NVM) devices both as weight memory and for multiply-accumulate operations, is expected to achieve extremely low power consumption. However, unintended nonlinearities and hysteresis arise when analog spiking neuron circuits are implemented as simply as possible. As a result, accuracy loss is expected when inference is performed by mapping the weight parameters of offline-trained SNNs onto the element parameters of the NVM devices. In this study, we design new neuromorphic hardware, operating at 100 MHz, that employs memcapacitors as synaptic elements and is expected to have ultra-low power consumption. We also propose a method for training SNNs that incorporates the nonlinearity of the designed circuit into the neuron model and converts the synaptic weights into circuit element parameters. The proposed training method reduces accuracy degradation even for very simple neuron circuits. The proposed circuit and method classify MNIST at ~33.88 nJ/inference, excluding the encoder, with ~97% accuracy. The circuit design and the measurement of circuit characteristics were performed in a Rohm 180 nm process using HSPICE. A spiking neuron model that incorporates the circuit nonlinearity as an activation function was implemented in PyTorch, a machine learning framework for Python.
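The core idea of the abstract — replacing the ideal linear membrane update with a neuron model that bakes in the circuit's nonlinearity before training — can be sketched as follows. This is a minimal illustrative model, not the paper's method: the function names, the leak factor, and the saturation term `v_sat` are all hypothetical stand-ins for the fitted HSPICE characteristic the authors actually use.

```python
def ideal_lif_step(v, i_in, leak=0.9, v_th=1.0):
    """One timestep of an ideal leaky integrate-and-fire neuron:
    linear charge accumulation on the membrane, hard reset on spike."""
    v = leak * v + i_in
    if v >= v_th:
        return 0.0, True   # spike and reset
    return v, False

def memcap_lif_step(v, i_in, leak=0.9, v_th=1.0, v_sat=1.5):
    """Same neuron with a hypothetical circuit nonlinearity: the charge
    delivered per step shrinks as the membrane approaches v_sat, a
    stand-in for the charge-sharing saturation of a simple memcapacitive
    synapse array (illustrative only; it does not reproduce the paper's
    measured curve)."""
    v = leak * v + i_in * (1.0 - v / v_sat)  # saturating charge transfer
    if v >= v_th:
        return 0.0, True
    return v, False
```

Training against the second model (e.g., with surrogate gradients in PyTorch) would let the learned weights compensate for the saturation that the ideal model ignores, which is the accuracy-recovery mechanism the abstract describes.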
Pages: 6
Related Papers
50 records
  • [1] Spiking activity in a memcapacitive and memristive emulator-based bionic circuit
    Xu, Quan
    Ding, Xincheng
    Wang, Ning
    Chen, Bei
    Parastesh, Fatemeh
    Chen, Mo
    CHAOS SOLITONS & FRACTALS, 2024, 187
  • [2] Neuroevolution Guided Hybrid Spiking Neural Network Training
    Lu, Sen
    Sengupta, Abhronil
    FRONTIERS IN NEUROSCIENCE, 2022, 16
  • [3] SATA: Sparsity-Aware Training Accelerator for Spiking Neural Networks
    Yin, Ruokai
    Moitra, Abhishek
    Bhattacharjee, Abhiroop
    Kim, Youngeun
    Panda, Priyadarshini
    IEEE TRANSACTIONS ON COMPUTER-AIDED DESIGN OF INTEGRATED CIRCUITS AND SYSTEMS, 2023, 42 (06) : 1926 - 1938
  • [4] Training multi-bit Spiking Neural Network with Virtual Neurons
    Xu, Haoran
    Gu, Zonghua
    Sun, Ruimin
    Ma, De
    NEUROCOMPUTING, 2025, 634
  • [5] Neurogenesis Dynamics-inspired Spiking Neural Network Training Acceleration
    Huang, Shaoyi
    Fang, Haowen
    Mahmood, Kaleel
    Lei, Bowen
    Xu, Nuo
    Lei, Bin
    Sun, Yue
    Xu, Dongkuan
    Wen, Wujie
    Ding, Caiwen
    2023 60TH ACM/IEEE DESIGN AUTOMATION CONFERENCE, DAC, 2023,
  • [6] SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training
    Liu, Fangxin
    Zhao, Wenbo
    Chen, Yongbiao
    Wang, Zongwu
    Yang, Tao
    Jiang, Li
    FRONTIERS IN NEUROSCIENCE, 2021, 15
  • [7] TT-SNN: Tensor Train Decomposition for Efficient Spiking Neural Network Training
    Lee, Donghyun
    Yin, Ruokai
    Kim, Youngeun
    Moitra, Abhishek
    Li, Yuhang
    Panda, Priyadarshini
    2024 DESIGN, AUTOMATION & TEST IN EUROPE CONFERENCE & EXHIBITION, DATE, 2024,
  • [8] StereoSpike: Depth Learning With a Spiking Neural Network
    Rancon, Ulysse
    Cuadrado-Anibarro, Javier
    Cottereau, Benoit R.
    Masquelier, Timothee
    IEEE ACCESS, 2022, 10 : 127428 - 127439
  • [9] EXODUS: Stable and efficient training of spiking neural networks
    Bauer, Felix C.
    Lenz, Gregor
    Haghighatshoar, Saeid
    Sheik, Sadique
    FRONTIERS IN NEUROSCIENCE, 2023, 17
  • [10] Thermal-Aware Compilation of Spiking Neural Networks to Neuromorphic Hardware
    Titirsha, Twisha
    Das, Anup
    LANGUAGES AND COMPILERS FOR PARALLEL COMPUTING, LCPC 2020, 2022, 13149 : 134 - 150