Efficient VLSI Implementation of Neural Networks With Hyperbolic Tangent Activation Function

Cited by: 105
Authors
Zamanlooy, Babak [1 ]
Mirhassani, Mitra [1 ]
Affiliation
[1] Univ Windsor, Dept Elect & Comp Engn, Windsor, ON N9B 3P4, Canada
Keywords
Hyperbolic tangent; neural networks; nonlinear activation function; VLSI implementation; SIGMOID FUNCTION; HARDWARE IMPLEMENTATION; GENERATORS; DESIGN;
DOI
10.1109/TVLSI.2012.2232321
CLC Classification
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
The nonlinear activation function is one of the main building blocks of artificial neural networks. Hyperbolic tangent and sigmoid are the most widely used nonlinear activation functions, and their accurate implementation in digital hardware faces certain challenges. In this paper, an efficient approximation scheme for the hyperbolic tangent function is proposed. The approximation is based on a mathematical analysis that treats the maximum allowable error as a design parameter. A hardware implementation of the proposed approximation scheme is presented, showing that the proposed structure compares favorably with previous architectures in terms of area and delay. The proposed structure requires fewer output bits for the same maximum allowable error than the state-of-the-art. Since the number of output bits of the activation function determines the bit width of the multipliers and adders in the network, the proposed activation function reduces area, delay, and power in VLSI implementations of artificial neural networks with hyperbolic tangent activation functions.
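The abstract's core idea — building a tanh approximation whose resolution is driven by a maximum-allowable-error design parameter — can be illustrated with a software sketch. The code below is a hypothetical piecewise-linear scheme written for illustration only, not the paper's actual hardware architecture: it exploits the odd symmetry of tanh, clamps the saturation region beyond an assumed cutoff `x_max`, and doubles the number of linear segments until the measured error falls below the target `eps`.

```python
import math

def make_tanh_approx(eps, x_max=4.0):
    """Return (approx_fn, n_segments) where approx_fn is a piecewise-linear
    tanh approximation whose maximum error, checked on a dense grid, is
    below eps. Illustrative sketch; x_max = 4.0 is an assumed cutoff
    (1 - tanh(4) is about 6.7e-4, so eps must exceed that)."""
    n = 1
    while True:
        # Uniform breakpoints on [0, x_max]; odd symmetry covers x < 0.
        xs = [x_max * i / n for i in range(n + 1)]
        ys = [math.tanh(x) for x in xs]

        def approx(x, xs=xs, ys=ys, n=n):
            sign, x = (1.0, x) if x >= 0 else (-1.0, -x)  # odd symmetry
            if x >= x_max:
                return sign * 1.0                          # saturation region
            i = min(int(x / x_max * n), n - 1)             # segment index
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return sign * (ys[i] + t * (ys[i + 1] - ys[i]))  # linear interp

        # Measure worst-case error on a dense grid over [0, 2*x_max].
        err = max(abs(approx(x) - math.tanh(x))
                  for x in (k * 0.001 for k in range(0, 8001)))
        if err < eps:
            return approx, n
        n *= 2  # refine: double the segment count
```

In hardware terms, a smaller `eps` forces more segments (more lookup-table entries) and more output bits; the paper's contribution is achieving a given `eps` with fewer output bits than prior designs, which in turn shrinks the downstream multipliers and adders.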
Pages: 39 - 48 (10 pages)
Related Papers (50 total)
  • [31] Reconfigurable Communication Fabric for Efficient Implementation of Neural Networks
    Firuzan, Arash
    Modarressi, Mehdi
    Daneshtalab, Masoud
    [J]. 2015 10TH INTERNATIONAL SYMPOSIUM ON RECONFIGURABLE COMMUNICATION-CENTRIC SYSTEMS-ON-CHIP (RECOSOC), 2015,
  • [32] Low complexity VLSI implementation of CORDIC-based exponent calculation for neural networks
    Aggarwal, Supriya
    Khare, Kavita
    [J]. INTERNATIONAL JOURNAL OF ELECTRONICS, 2012, 99 (11) : 1471 - 1488
  • [33] The research of HPA mathematical model based on hyperbolic tangent function
    Li, Xu
    Wei, Qiu
    Yang, Liu
    [J]. 2019 INTERNATIONAL CONFERENCE ON ROBOTS & INTELLIGENT SYSTEM (ICRIS 2019), 2019, : 535 - 538
  • [34] Parametrized Half-Hyperbolic Tangent Function-Activated Complex-Valued Neural Network Approximation
    Anastassiou, George A.
    Karateke, Seda
    [J]. SYMMETRY-BASEL, 2024, 16 (12):
  • [35] Clock Gating-Based Effectual Realization of Stochastic Hyperbolic Tangent Function for Deep Neural Hardware Accelerators
    Gunjan Rajput
    V. Logashree
    Kunika Naresh Biyani
    Santosh Kumar Vishvakarma
    [J]. Circuits, Systems, and Signal Processing, 2023, 42 : 5978 - 6000
  • [36] Enhancement of neural networks with an alternative activation function tanhLU
    Shen, Shui-Long
    Zhang, Ning
    Zhou, Annan
    Yin, Zhen-Yu
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2022, 199
  • [37] An efficient hardware implementation of feed-forward neural networks
    Szabó, T
    Horváth, G
    [J]. APPLIED INTELLIGENCE, 2004, 21 (02) : 143 - 158
  • [38] Flash Memory Array for Efficient Implementation of Deep Neural Networks
    Han, Runze
    Xiang, Yachen
    Huang, Peng
    Shan, Yihao
    Liu, Xiaoyan
    Kang, Jinfeng
    [J]. ADVANCED INTELLIGENT SYSTEMS, 2021, 3 (05)
  • [39] An Efficient Hardware Implementation of Feed-Forward Neural Networks
    Tamás Szabó
    Gábor Horváth
    [J]. Applied Intelligence, 2004, 21 : 143 - 158
  • [40] RSigELU: A nonlinear activation function for deep neural networks
    Kilicarslan, Serhat
    Celik, Mete
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2021, 174 (174)