LOW-POWER BUILDING-BLOCK FOR ARTIFICIAL NEURAL NETWORKS

Cited by: 9
Authors
LEE, ST
LAU, KT
Affiliation
[1] Microelectronics Centre, School of Electrical and Electronic Engineering, Nanyang Technological University, Nanyang Avenue
Keywords
NEURAL NETWORKS; INTEGRATED CIRCUITS;
DOI
10.1049/el:19951138
CLC classification
TM [Electrical engineering]; TN [Electronics and communication technology];
Discipline codes
0808; 0809;
Abstract
The authors propose and analyse a low-power CMOS building block for artificial neural networks (ANNs) that can function either as a synapse or a neuron. The design is based on the current-mode approach and uses the square-law characteristics of a MOS transistor operating in saturation. The new building block uses I-V converters, a current mirror and a +/- 1 V power supply to achieve superior performance. Modularity, ease of interconnectivity, expandability and reconfigurability are the advantages of this building block.
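The square-law principle the abstract relies on can be illustrated with a small behavioural sketch (this is not the paper's circuit; the function names, `beta`, `V_T`, and the linear I-V converter are illustrative assumptions). A MOSFET in saturation delivers a drain current proportional to the square of its gate overdrive, so feeding an input current through an I-V converter into the gate yields an output current proportional to the square of the input:

```python
def drain_current(v_gs, beta=2e-4, v_t=0.4):
    """Ideal square-law saturation current: I_D = 0.5 * beta * (V_GS - V_T)^2.

    Returns 0 below threshold. beta (A/V^2) and v_t (V) are illustrative.
    """
    v_ov = v_gs - v_t  # gate overdrive voltage
    return 0.5 * beta * v_ov * v_ov if v_ov > 0 else 0.0


def i_to_v(i_in, r=1e3, v_bias=0.4):
    """Hypothetical linear I-V converter: V_out = V_bias + R * I_in.

    Biased at the threshold so the overdrive tracks R * I_in directly.
    """
    return v_bias + r * i_in


# Current-mode squaring: the composed transfer is I_out = 0.5*beta*(R*I_in)^2,
# i.e. the output current is proportional to the square of the input current.
i_out = drain_current(i_to_v(1e-4))  # 0.5 * 2e-4 * (1e3 * 1e-4)^2 = 1e-6 A
```

In the actual building block a current mirror would copy or scale such currents between stages; the sketch only captures why a saturated MOS device gives a squaring (and hence multiplication-capable) current-mode primitive.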
Pages: 1618 - 1619 (2 pages)