Device Variation Effects on Neural Network Inference Accuracy in Analog In-Memory Computing Systems

Cited by: 16
Authors:
Wang, Qiwen [1]
Park, Yongmo [1]
Lu, Wei D. [1]
Affiliations:
[1] Univ Michigan, Dept Elect Engn & Comp Sci, Ann Arbor, MI 48109 USA
Funding:
US National Science Foundation (NSF)
Keywords:
analog computing; deep neural networks; emerging memory; in-memory computing; process-in-memory; RRAM; MEMRISTOR; NOISE;
DOI:
10.1002/aisy.202100199
CLC number:
TP [Automation & Computer Technology]
Discipline code:
0812
Abstract:
In analog in-memory computing systems based on nonvolatile memories such as resistive random-access memory (RRAM), neural network models are typically trained offline and the weights are then programmed onto memory devices as conductance values. The programmed weight values inevitably deviate from their targets during programming, an effect that can be pronounced for emerging memories such as RRAM, PCRAM, and MRAM due to the stochastic nature of the programming process. Unlike noise, these weight deviations remain fixed during inference. The performance of neural network models under this programming variation is investigated subject to realistic system limitations, including limited device on/off ratios, memory array size, analog-to-digital converter (ADC) characteristics, and signed weight representations. Approaches to mitigating such device and circuit nonidealities through architecture-aware training are also evaluated, including the effectiveness of variation injection during training for improving inference robustness and the influence of training parameters such as the learning rate schedule.
Pages: 12
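The variation-injection idea summarized in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's exact setup: the multiplicative Gaussian deviation model, the 5% relative spread, and all array shapes are assumptions. The key distinction it demonstrates is that training-time injection draws a fresh perturbation on every forward pass, whereas the programmed-weight deviation at inference is drawn once and then stays fixed for all inputs.

```python
import numpy as np

rng = np.random.default_rng(0)


def inject_variation(weights, sigma, rng):
    """Apply a multiplicative Gaussian perturbation: each conductance
    deviates from its target by a zero-mean factor of relative std sigma
    (an assumed device-variation model, for illustration only)."""
    return weights * (1.0 + sigma * rng.standard_normal(weights.shape))


def noisy_forward(x, weights, sigma, rng):
    """Training-time injection: a new perturbation is drawn per forward
    pass, so the learned weights become robust to programming variation."""
    return x @ inject_variation(weights, sigma, rng)


# Inference-time deviation is static: drawn once when the crossbar is
# "programmed", then reused unchanged for every input batch.
weights = rng.standard_normal((4, 3))
programmed = inject_variation(weights, sigma=0.05, rng=rng)

x = rng.standard_normal((2, 4))
y_ideal = x @ weights          # software baseline
y_programmed = x @ programmed  # fixed deviation, unlike per-sample noise
```

Evaluating `y_programmed` against `y_ideal` over many independently drawn `programmed` arrays is one simple way to estimate the accuracy spread caused by programming variation.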