ReAFM: A Reconfigurable Nonlinear Activation Function Module for Neural Networks

Cited by: 2
Authors
Wu, Xiao [1 ]
Liang, Shuang [1 ]
Wang, Meiqi [1 ]
Wang, Zhongfeng [1 ]
Affiliations
[1] Nanjing Univ, Sch Elect Sci & Engn, Nanjing 210093, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Deep neural network; activation functions; FPGA; VLSI; hardware architecture; HARDWARE IMPLEMENTATION;
DOI
10.1109/TCSII.2023.3241487
CLC Classification Numbers
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
Deep neural networks (DNNs) with various nonlinear activation functions (NAFs) have achieved unprecedented successes, sparking interest in efficient hardware implementations of DNNs. However, most existing NAF implementations target a single type of function with a dedicated architecture, which makes them ill-suited to versatile DNN accelerators. In this brief, based on a proposed reciprocal approximation optimization (RAO) method, an efficient reconfigurable nonlinear activation function module (ReAFM) is devised to implement various NAFs. By leveraging the correlations among different NAFs, the computational logic and dataflow of certain NAFs are merged and reused to minimize hardware consumption. In addition, a precision-adjustable exponential unit (PAEU) is developed to obtain a good tradeoff between approximation accuracy and hardware cost. Compared to the prior art, experimental results demonstrate that the proposed ReAFM supports many more NAF types with comparable or even better performance. Furthermore, evaluation on several prevalent neural networks shows that the proposed approximation method causes negligible accuracy loss (< 0.1%).
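The brief describes fixed-point hardware, and no code accompanies this record. As a rough functional illustration only, the Python sketch below shows the kind of correlation the abstract refers to: sigmoid, tanh, and softmax all factor into a shared exponential stage followed by a reciprocal stage, so one datapath can plausibly be reconfigured across them. The names exp_adjustable and reciprocal are hypothetical stand-ins, and the `terms` knob only loosely mimics the accuracy/cost tradeoff attributed to the PAEU; this is not the ReAFM design itself.

```python
import math

def exp_adjustable(x: float, terms: int = 5) -> float:
    """Stand-in for a precision-adjustable exponential unit (PAEU):
    range-reduce so |r| <= ln(2)/2, then evaluate a truncated Taylor
    series whose length `terms` trades accuracy against cost."""
    k = round(x / math.log(2.0))           # e^x = 2^k * e^r
    r = x - k * math.log(2.0)
    acc, term = 1.0, 1.0
    for n in range(1, terms):
        term *= r / n                      # r^n / n!
        acc += term
    return math.ldexp(acc, k)              # acc * 2**k

def reciprocal(y: float) -> float:
    """Placeholder for a shared reciprocal-approximation stage
    (the RAO method in the brief optimizes this step in hardware)."""
    return 1.0 / y

# Each NAF below reduces to exp followed by a reciprocal, which is the
# kind of shared structure a reconfigurable module can reuse.
def sigmoid(x: float, terms: int = 5) -> float:
    return reciprocal(1.0 + exp_adjustable(-x, terms))    # 1/(1+e^-x)

def tanh(x: float, terms: int = 5) -> float:
    return 2.0 * sigmoid(2.0 * x, terms) - 1.0            # 2*sig(2x)-1

def softmax(xs, terms: int = 5):
    m = max(xs)                                           # for stability
    es = [exp_adjustable(v - m, terms) for v in xs]
    inv = reciprocal(sum(es))                             # one reciprocal
    return [e * inv for e in es]

if __name__ == "__main__":
    grid = [i / 10.0 for i in range(-80, 81)]
    for t in (3, 5, 7):                    # the accuracy/cost knob
        err = max(abs(sigmoid(x, t) - 1.0 / (1.0 + math.exp(-x)))
                  for x in grid)
        print(f"terms={t}: max sigmoid error = {err:.1e}")
```

Running the sketch shows the error shrinking as `terms` grows, mirroring (in software) the accuracy-versus-cost tradeoff the PAEU is said to expose in hardware.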
Pages: 2660-2664
Page count: 5