ReAFM: A Reconfigurable Nonlinear Activation Function Module for Neural Networks

Cited by: 2
Authors
Wu, Xiao [1 ]
Liang, Shuang [1 ]
Wang, Meiqi [1 ]
Wang, Zhongfeng [1 ]
Affiliations
[1] Nanjing Univ, Sch Elect Sci & Engn, Nanjing 210093, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Deep neural network; activation functions; FPGA; VLSI; hardware architecture; HARDWARE IMPLEMENTATION;
DOI
10.1109/TCSII.2023.3241487
Chinese Library Classification
TM [Electrical technology]; TN [Electronic technology, communication technology];
Discipline Classification Code
0808; 0809;
Abstract
Deep neural networks (DNNs) with various nonlinear activation functions (NAFs) have achieved unprecedented successes, sparking interest in efficient DNN hardware implementation. However, most existing NAF implementations focus on a single type of function with a dedicated architecture, which is ill-suited to versatile DNN accelerators. In this brief, based on a proposed reciprocal approximation optimization (RAO) method, an efficient reconfigurable nonlinear activation function module (ReAFM) is devised to implement various NAFs. By leveraging the correlations among different NAFs, the computational logic and dataflow of certain NAFs are merged and reused to minimize hardware consumption. In addition, a precision-adjustable exponential unit (PAEU) is developed to obtain a good tradeoff between approximation accuracy and hardware cost. Compared to the prior art, experimental results demonstrate that the proposed ReAFM supports many more NAF types with comparable or even better performance. Furthermore, evaluation results on several prevalent neural networks show that the proposed approximation method causes negligible accuracy loss (< 0.1%).
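The abstract's core idea — that several NAFs can share one exponential unit and one reciprocal unit — can be illustrated with a small sketch. This is not the paper's ReAFM/RAO implementation: the function names, the truncated-Taylor exponential (standing in loosely for the precision-adjustable idea of the PAEU), and the Newton-Raphson reciprocal are all illustrative assumptions.

```python
import math

def approx_exp(x, terms=8):
    # Truncated Taylor series for e^x; `terms` trades accuracy for cost,
    # loosely mirroring a precision-adjustable exponential (illustrative only).
    acc, term = 1.0, 1.0
    for n in range(1, terms):
        term *= x / n
        acc += term
    return acc

def approx_recip(d, iters=3):
    # Newton-Raphson reciprocal: y <- y * (2 - d*y), for d > 0.
    # Seed from the power of two just above d so that d*y0 lies in (0.5, 1],
    # which guarantees convergence.
    y = 2.0 ** -math.ceil(math.log2(d))
    for _ in range(iters):
        y = y * (2.0 - d * y)
    return y

def reafm_sketch(x, mode):
    # Both modes reduce to one exp evaluation plus one reciprocal,
    # since tanh(x) = 2*sigmoid(2x) - 1.
    if mode == "sigmoid":
        return approx_recip(1.0 + approx_exp(-x))
    if mode == "tanh":
        return 2.0 * approx_recip(1.0 + approx_exp(-2.0 * x)) - 1.0
    raise ValueError(f"unsupported mode: {mode}")
```

Because tanh reduces algebraically to a rescaled sigmoid, both modes reuse the same exp-then-reciprocal datapath; this kind of logic sharing across correlated NAFs is what the abstract describes at the hardware level.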
Pages: 2660-2664 (5 pages)
Related Papers (50 records)
  • [1] Analog-to-Digital Conversion With Reconfigurable Function Mapping for Neural Networks Activation Function Acceleration
    Giordano, Massimo
    Cristiano, Giorgio
    Ishibashi, Koji
    Ambrogio, Stefano
    Tsai, Hsinyu
    Burr, Geoffrey W.
    Narayanan, Pritish
    IEEE JOURNAL ON EMERGING AND SELECTED TOPICS IN CIRCUITS AND SYSTEMS, 2019, 9 (02) : 367 - 376
  • [2] A reconfigurable multi-precision quantization-aware nonlinear activation function hardware module for DNNs
    Hong, Qi
    Liu, Zhiming
    Long, Qiang
    Tong, Hao
    Zhang, Tianxu
    Zhu, Xiaowen
    Zhao, Yunong
    Ru, Hua
    Zha, Yuxing
    Zhou, Ziyuan
    Wu, Jiashun
    Tan, Hongtao
    Hong, Weiqiang
    Xu, Yaohua
    Guo, Xiaohui
    MICROELECTRONICS JOURNAL, 2024, 151
  • [3] KAF + RSigELU: a nonlinear and kernel-based activation function for deep neural networks
    Kiliçarslan, Serhat
    Celik, Mete
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (16) : 13909 - 13923
  • [4] High accuracy FPGA activation function implementation for neural networks
    Hajduk, Zbigniew
    NEUROCOMPUTING, 2017, 247 : 59 - 61
  • [5] Reconfigurable FPGA implementation of neural networks
    Hajduk, Zbigniew
    NEUROCOMPUTING, 2018, 308 : 227 - 234
  • [6] KAF plus RSigELU: a nonlinear and kernel-based activation function for deep neural networks
    Kilicarslan, Serhat
    Celik, Mete
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (16) : 13909 - 13923
  • [7] FPGA Realization of Activation Function for Artificial Neural Networks
    Saichand, Venakata
    Nirmala, Devi M.
    Arumugam, S.
    Mohankumar, N.
    ISDA 2008: EIGHTH INTERNATIONAL CONFERENCE ON INTELLIGENT SYSTEMS DESIGN AND APPLICATIONS, VOL 3, PROCEEDINGS, 2008, : 159 - 164
  • [8] Artificial Neural Networks Activation Function HDL Coder
    Namin, Ashkan Hosseinzadeh
    Leboeuf, Karl
    Wu, Huapeng
    Ahmadi, Majid
    2009 IEEE INTERNATIONAL CONFERENCE ON ELECTRO/INFORMATION TECHNOLOGY, 2009, : 387 - 390
  • [9] Nonlinear Activation Functions for Artificial Neural Networks Realized in Hardware
    Dlugosz, Zofia
    Dlugosz, Rafal
    PROCEEDINGS OF THE 25TH INTERNATIONAL CONFERENCE MIXED DESIGN OF INTEGRATED CIRCUITS AND SYSTEM (MIXDES 2018), 2018, : 381 - 384
  • [10] Efficient Implementation of Activation Function on FPGA for Accelerating Neural Networks
    Qian, Kai
    Liu, Yinqiu
    Zhang, Zexu
    Wang, Kun
    2023 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, ISCAS, 2023