Low-Power Hardware Implementation for Parametric Rectified Linear Unit Function

Cited by: 1
Authors
Wu, Yu-Hsuan [1 ]
Lin, Wei-Hung [1 ]
Huang, Shih-Hsu [1 ]
Affiliations
[1] Chung Yuan Christian University, Department of Electronic Engineering, Taoyuan, Taiwan
Source
2020 IEEE International Conference on Consumer Electronics - Taiwan (ICCE-Taiwan), 2020
Keywords
Activation Function; Approximate Multiplier; Digital Circuit; Logic Design; Low Power;
DOI
10.1109/icce-taiwan49838.2020.9258135
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The PReLU (Parametric ReLU) function is an extension of the ReLU (Rectified Linear Unit) that improves neural network accuracy. However, to date, little attention has been paid to the hardware design of the PReLU function. Based on the characteristics of the PReLU function, in this paper we propose a dedicated multiplier for the PReLU function. Using this dedicated multiplier, we develop a low-power PReLU hardware implementation. Experimental results show that the proposed approach significantly reduces power consumption with only a small loss of accuracy.
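The abstract is the only technical description in this record, but the underlying idea is easy to illustrate: PReLU computes f(x) = x for x >= 0 and f(x) = a * x for x < 0 (where a is a learned slope), so a multiplier fires only on the negative branch, and a cheaper approximate multiplier there trades a small accuracy loss for lower power. The C sketch below is a hypothetical software model of that trade-off, not the authors' circuit; the Q8.8 format, the truncation width, and the names approx_mul_q88 and prelu_q88 are all illustrative assumptions.

```c
#include <stdio.h>
#include <stdint.h>

#define FRAC_BITS  8   /* Q8.8 fixed point: 8 fractional bits */
#define TRUNC_BITS 4   /* low-order bits dropped by the approximate multiplier */

/* Truncated (approximate) multiplier: discard TRUNC_BITS low-order bits of
 * each operand before multiplying, emulating a smaller partial-product
 * array. Assumes 2*TRUNC_BITS == FRAC_BITS (so no rescaling shift is
 * needed) and an arithmetic right shift for negative values, which holds
 * on typical targets. */
static int32_t approx_mul_q88(int16_t x, int16_t a) {
    int32_t xt = x >> TRUNC_BITS;
    int32_t at = a >> TRUNC_BITS;
    return xt * at;   /* already Q8.8: the two shifts removed 2*4 = 8 bits */
}

/* PReLU: identity for x >= 0, a*x for x < 0, so the (approximate)
 * multiplier is exercised only on the negative branch. */
static int16_t prelu_q88(int16_t x, int16_t a) {
    if (x >= 0)
        return x;
    return (int16_t)approx_mul_q88(x, a);
}

int main(void) {
    int16_t a = (int16_t)(0.25 * (1 << FRAC_BITS));   /* learned slope a = 0.25 */
    double inputs[] = { 3.0, -3.0, -1.37 };
    for (int i = 0; i < 3; i++) {
        int16_t x = (int16_t)(inputs[i] * (1 << FRAC_BITS));
        int16_t y = prelu_q88(x, a);
        printf("x = %+7.4f  ->  PReLU(x) = %+7.4f\n",
               (double)x / (1 << FRAC_BITS), (double)y / (1 << FRAC_BITS));
    }
    return 0;
}
```

In hardware terms, dropping TRUNC_BITS low-order bits from each operand shrinks the multiplier's partial-product array roughly quadratically, which is typically where the power saving of a truncated multiplier comes from; running the sketch shows the small deviation from the exact product on the negative inputs.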
Pages: 2