Efficient VLSI Implementation of Neural Networks With Hyperbolic Tangent Activation Function

Cited by: 105
Authors
Zamanlooy, Babak [1 ]
Mirhassani, Mitra [1 ]
Affiliations
[1] Univ Windsor, Dept Elect & Comp Engn, Windsor, ON N9B 3P4, Canada
Keywords
Hyperbolic tangent; neural networks; nonlinear activation function; VLSI implementation; SIGMOID FUNCTION; HARDWARE IMPLEMENTATION; GENERATORS; DESIGN;
DOI
10.1109/TVLSI.2012.2232321
CLC Number
TP3 [Computing technology, computer technology]
Subject Classification Code
0812
Abstract
The nonlinear activation function is one of the main building blocks of artificial neural networks, and the hyperbolic tangent and the sigmoid are the most widely used nonlinear activation functions. Accurate implementation of these transfer functions in digital networks faces certain challenges. In this paper, an efficient approximation scheme for the hyperbolic tangent function is proposed. The approximation is based on a mathematical analysis that treats the maximum allowable error as a design parameter. A hardware implementation of the proposed approximation scheme is presented, which shows that the proposed structure compares favorably with previous architectures in terms of area and delay. The proposed structure requires fewer output bits for the same maximum allowable error than state-of-the-art designs. Since the number of output bits of the activation function determines the bit width of the multipliers and adders in the network, the proposed activation function yields reductions in area, delay, and power in VLSI implementations of artificial neural networks with the hyperbolic tangent activation function.
Pages: 39-48
Number of pages: 10
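The abstract ties the maximum allowable approximation error directly to the number of output bits of the activation function. As a rough illustration of that relationship only (not the authors' architecture, which is described in the paper itself), the following Python sketch derives a fractional output bit width from an error budget and applies a saturate-and-quantize tanh; the names `output_bits_for_error`, `quantized_tanh`, and `max_err` are illustrative assumptions, not from the paper.

```python
import math

def output_bits_for_error(max_err: float) -> int:
    """Fractional output bits n so that rounding to n bits contributes
    at most 2^-(n+1) <= max_err of error (illustrative rule of thumb)."""
    return max(1, math.ceil(-math.log2(2.0 * max_err)))

def quantized_tanh(x: float, max_err: float = 0.01) -> float:
    """Toy fixed-point tanh: saturate where |tanh(x)| is within max_err
    of 1, then round the output to the bit width implied by max_err.
    A sketch of the error-budget idea only, not the paper's scheme."""
    n = output_bits_for_error(max_err)
    scale = 1 << n
    x_sat = math.atanh(1.0 - max_err)   # beyond this, clamping error < max_err
    if x >= x_sat:
        y = 1.0
    elif x <= -x_sat:
        y = -1.0
    else:
        y = math.tanh(x)                # stand-in for the hardware approximation
    return round(y * scale) / scale     # quantize to n fractional bits

if __name__ == "__main__":
    for err in (0.04, 0.02, 0.01):
        bits = output_bits_for_error(err)
        worst = max(abs(quantized_tanh(i / 1000.0, err) - math.tanh(i / 1000.0))
                    for i in range(-4000, 4001))
        print(f"max_err={err}: {bits} fractional output bits, "
              f"observed error {worst:.4f}")
```

Because the activation output feeds the multipliers and adders of the following layer, trimming even one output bit under the same error budget shrinks those datapath elements as well, which is the area, delay, and power argument the abstract makes.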
Related Papers (50 total)
  • [21] VLSI implementation of artificial neural networks - A survey
    Nirmaladevi M.
    Arumugam S.
    International Journal of Modelling and Simulation, 2010, 30 (02) : 148 - 154
  • [22] The construction and approximation of feedforward neural network with hyperbolic tangent function
Chen, Zhi-xiang
Cao, Fei-long
    Applied Mathematics-A Journal of Chinese Universities, 2015, 30 : 151 - 162
  • [23] The construction and approximation of feedforward neural network with hyperbolic tangent function
Chen, Zhi-xiang
Cao, Fei-long
Applied Mathematics - A Journal of Chinese Universities, 2015, (02) : 151 - 162
  • [24] Performance Analysis of Table-Based Approximations of the Hyperbolic Tangent Activation Function
    Leboeuf, Karl
    Muscedere, Roberto
    Ahmadi, Majid
    2011 IEEE 54TH INTERNATIONAL MIDWEST SYMPOSIUM ON CIRCUITS AND SYSTEMS (MWSCAS), 2011,
  • [25] Clock Gating-Based Effectual Realization of Stochastic Hyperbolic Tangent Function for Deep Neural Hardware Accelerators
    Rajput, Gunjan
    Logashree, V.
    Biyani, Kunika Naresh
    Vishvakarma, Santosh Kumar
    CIRCUITS SYSTEMS AND SIGNAL PROCESSING, 2023, 42 (10) : 5978 - 6000
  • [26] Artificial Neural Networks Activation Function HDL Coder
    Namin, Ashkan Hosseinzadeh
    Leboeuf, Karl
    Wu, Huapeng
    Ahmadi, Majid
    2009 IEEE INTERNATIONAL CONFERENCE ON ELECTRO/INFORMATION TECHNOLOGY, 2009, : 387 - 390
  • [27] Hyperbolic Tangent Activation Function on FIMT-DD Algorithm Analysis for Airline Big Data
    Wibisono, Ari
    Alhamidi, Machmud Roby
    Nurhadiyatna, Adi
    Jatmiko, Wisnu
    2017 INTERNATIONAL WORKSHOP ON BIG DATA AND INFORMATION SECURITY (IWBIS 2017), 2017, : 31 - 36
  • [28] A Novel Approximation Methodology and Its Efficient VLSI Implementation for the Sigmoid Function
    Qin, Zidi
    Qiu, Yuou
    Sun, Huaqing
    Lu, Zhonghai
    Wang, Zhongfeng
    Shen, Qinghong
    Pan, Hongbing
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2020, 67 (12) : 3422 - 3426
  • [29] Logic Neural Networks for Efficient FPGA Implementation
    Ramirez, Ivan
    Garcia-Espinosa, Francisco J.
    Concha, David
    Aranda, Luis Alberto
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I-REGULAR PAPERS, 2024,
  • [30] Multistability of neural networks with discontinuous activation function
    Huang, Gan
    Cao, Jinde
    COMMUNICATIONS IN NONLINEAR SCIENCE AND NUMERICAL SIMULATION, 2008, 13 (10) : 2279 - 2289