CSrram: Area-Efficient Low-Power Ex-Situ Training Framework for Memristive Neuromorphic Circuits Based on Clustered Sparsity

Cited by: 6
Authors
Fayyazi, Arash [1 ]
Kundu, Souvik [1 ]
Nazarian, Shahin [1 ]
Beerel, Peter A. [1 ]
Pedram, Massoud [1 ]
Affiliation
[1] Univ Southern Calif, Ming Hsieh Dept Elect & Comp Engn, Los Angeles, CA 90089 USA
Source
2019 IEEE Computer Society Annual Symposium on VLSI (ISVLSI 2019) | 2019
Funding
U.S. National Science Foundation;
Keywords
Clustered sparsity; memristive neuromorphic circuits; ex-situ training; artificial neural networks (ANNs); low-power circuits; neural networks; device
DOI
10.1109/ISVLSI.2019.00090
Chinese Library Classification
TP3 [Computing technology, computer technology];
Subject Classification Code
0812;
Abstract
Artificial Neural Networks (ANNs) play a key role in many machine learning (ML) applications but pose arduous challenges in terms of storing and computing network parameters. Memristive crossbar arrays (MCAs) are capable of both computation and storage, making them promising building blocks for in-memory-computing neural network accelerators. At the same time, the presence of a significant number of zero weights in ANNs has motivated research into a variety of parameter-reduction techniques. For crossbar-based architectures, however, efficient methods for exploiting network sparsity are still at an early stage. This paper presents CSrram, an efficient ex-situ training framework for hybrid CMOS-memristive neuromorphic circuits. CSrram includes a pre-defined block-diagonal clustered (BDC) sparsity algorithm that significantly reduces area and power consumption. The proposed framework is verified on a wide range of datasets, including MNIST handwritten-digit recognition, Fashion-MNIST, breast cancer prediction (BCW), IRIS, and mobile health monitoring. Compared to state-of-the-art fully connected memristive neuromorphic circuits, CSrram with only 25% weight density in the first junction improves power and area efficiency by 1.5x and 2.6x (averaged over the five datasets), respectively, without significant loss of test accuracy.
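The abstract's central idea is a pre-defined block-diagonal clustered (BDC) sparsity pattern imposed on the crossbar weights. As a rough illustration only (the exact clustering procedure used in CSrram is not described in this record), the NumPy sketch below builds a block-diagonal {0, 1} mask and applies it to a layer's weight matrix; the function and parameter names are hypothetical, and the 4-cluster setting is chosen simply because it reproduces the 25% first-junction density quoted in the abstract.

import numpy as np

def bdc_mask(n_in, n_out, num_clusters):
    # Build a {0, 1} mask whose nonzeros form num_clusters blocks along the
    # diagonal of an (n_in x n_out) weight matrix; everything else is zero.
    mask = np.zeros((n_in, n_out))
    row_edges = np.linspace(0, n_in, num_clusters + 1, dtype=int)
    col_edges = np.linspace(0, n_out, num_clusters + 1, dtype=int)
    for k in range(num_clusters):
        mask[row_edges[k]:row_edges[k + 1], col_edges[k]:col_edges[k + 1]] = 1.0
    return mask

# 4 clusters keep exactly 1/4 of the entries, i.e. the 25% density the
# abstract reports for the first junction of CSrram (layer sizes are illustrative).
mask = bdc_mask(784, 128, num_clusters=4)
print(mask.mean())                     # -> 0.25
W = np.random.randn(784, 128) * mask   # only in-block weights are trained and mapped to the crossbar

Because each cluster touches only a contiguous subset of rows and columns, such a mask lets each cluster be mapped to a smaller, independent crossbar tile, which is the intuition behind the area and power savings claimed above.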
Pages: 467-472
Page count: 6