Stochastic Implementation of the Activation Function for Artificial Neural Networks

Cited: 0
Authors
Yeo, Injune [1 ]
Gi, Sang-gyun [1 ]
Lee, Byung-geun [1 ]
Chu, Myonglae [2 ]
Affiliations
[1] Gwangju Inst Sci & Technol, Sch Elect Engn & Comp Sci, Gwangju, South Korea
[2] IMEC, Interuniv Microelect Ctr, Imager SoC Team, Leuven, Belgium
Source
PROCEEDINGS OF 2016 IEEE BIOMEDICAL CIRCUITS AND SYSTEMS CONFERENCE (BIOCAS) | 2016
Funding
National Research Foundation of Singapore;
Keywords
Artificial neural network; nonlinear activation function; neuromorphic; stochastic neuron; analog computing element;
DOI
Not available
CLC Number
TP [automation technology, computer technology];
Discipline Code
0812;
Abstract
One of the key elements of an artificial neural network (ANN) is the activation function (AF), which converts the weighted sum of a neuron's inputs into a firing probability. Hardware implementations of AFs require complicated circuits and dissipate considerable power, which makes it difficult to integrate a large number of neurons onto a single chip. This paper presents circuit techniques, based on stochastic computing, for realizing four types of AF: the step, identity, rectified linear unit (ReLU), and sigmoid functions. The proposed AF circuits are simpler and consume considerably less power than existing ones. A handwritten digit recognition system employing the AF circuits has been simulated to verify the effectiveness of the techniques.
Pages: 440-443
Page count: 4
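The paper's circuits themselves are not reproduced in this record, but the core idea behind a stochastic sigmoid AF can be illustrated numerically: a comparator that "fires" whenever the input exceeds a logistic-distributed noise sample has a firing probability of exactly 1/(1 + exp(-x)), so averaging a long fire/no-fire bitstream approximates the sigmoid without ever evaluating an exponential. The sketch below is a software illustration of this general stochastic-computing principle, not the authors' circuit; all function names are ours.

```python
import math
import random

def stochastic_sigmoid(x, n_bits=100_000, seed=0):
    """Estimate sigmoid(x) as the firing rate of a noisy threshold neuron.

    Each bit of the stream is 1 iff x exceeds a fresh Logistic(0, 1)
    noise sample; the probability of that event is the logistic CDF at x,
    i.e. 1 / (1 + exp(-x)). Averaging the bitstream therefore converges
    to the sigmoid AF using only comparisons and a counter.
    """
    rng = random.Random(seed)
    fires = 0
    for _ in range(n_bits):
        # Sample Logistic(0, 1) by inverse-CDF; clamp u away from the
        # endpoints so log() stays finite.
        u = min(max(rng.random(), 1e-12), 1.0 - 1e-12)
        noise = math.log(u / (1.0 - u))
        if x > noise:
            fires += 1
    return fires / n_bits

def step_af(x):
    """Step AF: the same comparator with the noise source disabled."""
    return 1.0 if x > 0.0 else 0.0
```

With 100,000 bits the estimate is typically within about 0.005 of the true sigmoid; precision scales with the bitstream length, which is the usual accuracy/latency trade-off in stochastic computing.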