A Memcapacitive Spiking Neural Network with Circuit Nonlinearity-aware Training

Cited by: 5
Authors
Oshio, Reon [1 ]
Sugahara, Takuya [1 ]
Sawada, Atsushi [1 ]
Kimura, Mutsumi [1 ,2 ]
Zhang, Renyuan [1 ]
Nakashima, Yasuhiko [1 ]
Affiliations
[1] Nara Inst Sci & Technol NAIST, Grad Sch Sci & Technol, Ikoma, Nara, Japan
[2] Ryukoku Univ, Grad Sch Sci & Technol, Otsu, Shiga, Japan
Source
IEEE SYMPOSIUM ON LOW-POWER AND HIGH-SPEED CHIPS AND SYSTEMS (2022 IEEE COOL CHIPS 25) | 2022
Keywords
Neuromorphic Computing; Spiking Neural Network (SNN); Processing-In-Memory (PIM); Analog Neuron Circuit; Memcapacitor; Hardware/Software Co-design;
DOI
10.1109/COOLCHIPS54332.2022.9772674
Chinese Library Classification
TP3 [Computing Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Neuromorphic computing is an unconventional computing scheme in which dedicated hardware executes computable algorithms using Spiking Neural Networks (SNNs) that mimic neural dynamics, achieving high speed and low power consumption. Analog implementations of neuromorphic computing have been studied in fields such as edge computing and are considered superior to digital implementations in terms of power consumption. Furthermore, Processing-In-Memory (PIM) based synaptic operation, which uses non-volatile memory (NVM) devices both as weight memory and for multiply-accumulate operations, is expected to achieve extremely low power consumption. However, unintended nonlinearities and hysteresis arise when analog spiking neuron circuits are implemented as simply as possible. These effects are thought to cause accuracy loss when inference is performed by mapping the weight parameters of SNNs trained offline onto the element parameters of the NVM. In this study, we newly designed neuromorphic hardware operating at 100 MHz that employs memcapacitors as synaptic elements, which is expected to achieve ultra-low power consumption. We also propose a method for training SNNs that incorporates the nonlinearity of the designed circuit into the neuron model and converts the synaptic weights into circuit element parameters. The proposed training method reduces the degradation of accuracy even for very simple neuron circuits. The proposed circuit and method classify MNIST at ~33.88 nJ/inference, excluding the encoder, with ~97% accuracy. The circuit design and measurement of circuit characteristics were carried out in a Rohm 180 nm process using HSPICE. A spiking neuron model that incorporates the circuit nonlinearity as an activation function was implemented in PyTorch, a machine learning framework for Python.
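The record does not reproduce the paper's neuron equations, so the sketch below only illustrates the general recipe the abstract describes: a leaky integrate-and-fire layer whose per-step charge injection passes through a nonlinear transfer curve standing in for the measured circuit behavior, trained in PyTorch with a surrogate gradient, with trained weights then mapped onto a memcapacitance range. The tanh nonlinearity, the sigmoid surrogate, the capacitance bounds, and all names (NonlinearLIF, circuit_nl, weights_to_capacitance) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, sigmoid surrogate gradient in the backward pass."""
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        s = torch.sigmoid(4.0 * v)          # smooth stand-in for the step's derivative
        return grad_out * 4.0 * s * (1.0 - s)

class NonlinearLIF(nn.Module):
    """Leaky integrate-and-fire layer whose charge accumulation passes through
    a saturating nonlinearity standing in for the measured circuit curve."""
    def __init__(self, in_features, out_features, beta=0.9, v_th=1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features, bias=False)
        self.beta, self.v_th = beta, v_th

    def circuit_nl(self, i):
        # Hypothetical saturating curve; the paper fits the actual curve
        # from HSPICE measurements of the memcapacitive synapse circuit.
        return torch.tanh(i)

    def forward(self, spikes_in, v):
        v = self.beta * v + self.circuit_nl(self.fc(spikes_in))  # leaky integration
        spikes_out = SurrogateSpike.apply(v - self.v_th)         # fire at threshold
        v = v * (1.0 - spikes_out)                               # reset fired neurons
        return spikes_out, v

def weights_to_capacitance(w, c_min=1e-15, c_max=10e-15):
    # Hypothetical affine mapping of trained weights onto a memcapacitance range;
    # the actual conversion in the paper depends on the circuit element parameters.
    w01 = (w - w.min()) / (w.max() - w.min() + 1e-12)
    return c_min + w01 * (c_max - c_min)

# Usage: run a 784->10 layer over 20 time steps of rate-coded input.
layer = NonlinearLIF(784, 10)
v = torch.zeros(1, 10)
for t in range(20):
    x = (torch.rand(1, 784) < 0.3).float()   # Bernoulli spike train per step
    out, v = layer(x, v)
caps = weights_to_capacitance(layer.fc.weight.detach())
```

Training such a model would unroll the loop over time, accumulate output spikes, and backpropagate a classification loss through the surrogate gradient; the weight-to-capacitance conversion shown last is only a placeholder for the circuit-specific mapping the abstract mentions.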
Pages: 6
Related Papers
50 records in total
  • [11] Endurance-Aware Mapping of Spiking Neural Networks to Neuromorphic Hardware
    Titirsha, Twisha
    Song, Shihao
    Das, Anup
    Krichmar, Jeffrey
    Dutt, Nikil
    Kandasamy, Nagarajan
    Catthoor, Francky
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2022, 33 (02) : 288 - 301
  • [12] Sparsity-Aware In-Memory Neuromorphic Computing Unit With Configurable Topology of Hybrid Spiking and Artificial Neural Network
    Liu, Ying
    Chen, Zhiyuan
    Zhao, Wentao
    Zhao, Tianhao
    Jia, Tianyu
    Wang, Zhixuan
    Huang, Ru
    Ye, Le
    Ma, Yufei
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I-REGULAR PAPERS, 2024, 71 (06) : 2660 - 2673
  • [13] Efficient spiking neural network design via neural architecture search
    Yan, Jiaqi
    Liu, Qianhui
    Zhang, Malu
    Feng, Lang
    Ma, De
    Li, Haizhou
    Pan, Gang
    NEURAL NETWORKS, 2024, 173
  • [14] When Audio Denoising Meets Spiking Neural Network
    Hao, Xiang
    Ma, Chenxiang
    Yang, Qu
    Tan, Kay Chen
    Wu, Jibin
    2024 IEEE CONFERENCE ON ARTIFICIAL INTELLIGENCE, CAI 2024, 2024, : 1524 - 1527
  • [15] A low-power charge-based integrate-and-fire circuit for binarized-spiking neural network
    Duong, Quang-Manh
    Trinh, Quang-Kien
    Nguyen, Van-Tinh
    Dao, Dinh-Ha
    Luong, Duy-Manh
    Hoang, Van-Phuc
    Lin, Longyang
    Deepu, John
    INTERNATIONAL JOURNAL OF CIRCUIT THEORY AND APPLICATIONS, 2023, 51 (07) : 3404 - 3414
  • [17] Surrogate gradient scaling for directly training spiking neural networks
    Chen, Tao
    Wang, Shu
    Gong, Yu
    Wang, Lidan
    Duan, Shukai
    APPLIED INTELLIGENCE, 2023, 53 (23) : 27966 - 27981
  • [18] The Influence of Temporal Dependency on Training Algorithms for Spiking Neural Networks
    Wei, Yongtao
    Wang, Siqi
    Nait-Abdesselam, Farid
    Benlarbi-Delai, Aziz
    20TH INTERNATIONAL WIRELESS COMMUNICATIONS & MOBILE COMPUTING CONFERENCE, IWCMC 2024, 2024, : 1036 - 1041
  • [19] A Fully Memristive Spiking Neural Network with Unsupervised Learning
    Zhou, Peng
    Choi, Dong-Uk
    Eshraghian, Jason K.
    Kang, Sung-Mo
    2022 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS 22), 2022, : 634 - 638
  • [20] A Heterogeneous Spiking Neural Network for Computationally Efficient Face Recognition
    Zhou, Xichuan
    Zhou, Zhenghua
    Zhong, Zhengqing
    Yu, Jianyi
    Wang, Tengxiao
    Tian, Min
    Jiang, Ying
    Shi, Cong
2021 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2021