ReAFM: A Reconfigurable Nonlinear Activation Function Module for Neural Networks

Cited by: 2
Authors
Wu, Xiao [1 ]
Liang, Shuang [1 ]
Wang, Meiqi [1 ]
Wang, Zhongfeng [1 ]
Affiliations
[1] Nanjing Univ, Sch Elect Sci & Engn, Nanjing 210093, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Deep neural network; activation functions; FPGA; VLSI; hardware architecture; hardware implementation
DOI
10.1109/TCSII.2023.3241487
Chinese Library Classification
TM [Electrical Technology]; TN [Electronic Technology & Communication Technology]
Discipline Codes
0808; 0809
Abstract
Deep neural networks (DNNs) with various nonlinear activation functions (NAFs) have achieved unprecedented success, sparking interest in efficient DNN hardware implementation. However, most existing NAF implementations target a single type of function with a dedicated architecture, which is ill-suited to versatile DNN accelerators. In this brief, based on a proposed reciprocal approximation optimization (RAO) method, an efficient reconfigurable nonlinear activation function module (ReAFM) is devised to implement various NAFs. By exploiting the correlations among different NAFs, the computational logic and dataflow of certain NAFs are merged and reused to minimize hardware consumption. In addition, a precision-adjustable exponential unit (PAEU) is developed to obtain a good tradeoff between approximation accuracy and hardware cost. Experimental results demonstrate that, compared to the prior art, the proposed ReAFM supports many more NAF types with comparable or even better performance. Furthermore, evaluation results on several prevalent neural networks show that the proposed approximation method causes negligible accuracy loss (< 0.1%).
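The record stops at the abstract, but the logic-sharing idea it describes can be made concrete with a small sketch. The Python below is a purely illustrative software analogue, not the paper's hardware design: approx_exp, sigmoid, tanh, silu, and gelu are hypothetical names, and the truncated Taylor series merely stands in for a precision-adjustable exponential so that the accuracy/cost knob (terms) is visible; it does not reproduce the paper's PAEU or RAO method.

```python
import math

LN2 = math.log(2.0)

def approx_exp(x: float, terms: int = 6) -> float:
    """Adjustable-precision e**x: range-reduce, then truncated Taylor.
    More terms ~ better accuracy at higher cost (the PAEU-style knob)."""
    k = round(x / LN2)           # e**x = 2**k * e**r with |r| <= ln(2)/2
    r = x - k * LN2
    s, term = 1.0, 1.0
    for n in range(1, terms + 1):
        term *= r / n            # accumulates r**n / n!
        s += term
    return math.ldexp(s, k)      # s * 2**k

def sigmoid(x: float) -> float:
    # One exponential evaluation plus one reciprocal: 1 / (1 + e**-x).
    return 1.0 / (1.0 + approx_exp(-x))

def tanh(x: float) -> float:
    # tanh(x) = 2*sigmoid(2x) - 1, so it reuses the sigmoid datapath.
    return 2.0 * sigmoid(2.0 * x) - 1.0

def silu(x: float) -> float:
    # SiLU/Swish = x * sigmoid(x): same shared path, one extra multiply.
    return x * sigmoid(x)

def gelu(x: float) -> float:
    # tanh-form GELU approximation, built on the shared tanh above.
    c = math.sqrt(2.0 / math.pi)
    return 0.5 * x * (1.0 + tanh(c * (x + 0.044715 * x ** 3)))

if __name__ == "__main__":
    for f in (sigmoid, tanh, silu, gelu):
        print(f.__name__, [round(f(v), 6) for v in (-2.0, 0.0, 2.0)])
```

The point of the sketch is structural: tanh, SiLU, and the tanh-form GELU all reduce to the sigmoid path, i.e., one exponential and one reciprocal, which is the kind of correlation among NAFs that makes a merged, reconfigurable module attractive in hardware.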
Pages
2660-2664 (5 pages)