Biased ReLU neural networks

Cited by: 32
Authors
Liang, XingLong [1 ]
Xu, Jun [1 ]
Affiliation
[1] Harbin Inst Technol, Shenzhen 518055, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Biased ReLU; Neural network; PWL network flexibility; Adaptive hinging hyperplanes
DOI
10.1016/j.neucom.2020.09.050
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Neural networks (NN) with rectified linear units (ReLU) have been widely implemented since 2012. In this paper, we describe an activation function called the biased ReLU (BReLU) neuron, which is similar to the ReLU. Based on this activation function, we propose the BReLU neural network (BRNN). The structure of the BRNN is similar to that of the ReLU network; the difference is that the BReLU introduces several biases for each input variable. This allows the BRNN to divide the input space into a greater number of linear regions, improving network flexibility. The BRNN parameters to be estimated are the weight matrices and the bias parameters of the BReLU neurons. The weights are obtained using backpropagation. Moreover, we propose a method to compute the bias parameters of the BReLU neurons: batch normalization is applied to the BRNN, the variance and mean of the input variables are obtained, and the bias parameters are estimated from these two statistics. In addition, we investigate the flexibility of the BRNN. Specifically, we study the number of linear regions and provide an upper bound on the maximum number of linear regions. The results indicate that, for the same input dimension, the BRNN divides the input space into more linear regions than the ReLU network, which explains to some extent why the BRNN has superior approximation ability. Experiments on five datasets verify the effectiveness of the proposed method. (c) 2020 Elsevier B.V. All rights reserved.
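The BReLU idea described in the abstract can be sketched as follows. This is an illustrative sketch only: the layer shapes, the `biases_from_stats` helper, and the evenly spaced placement of the biases in units of the feature standard deviation are assumptions, not the paper's exact construction (the paper derives the biases from batch-normalization mean and variance).

```python
import numpy as np

def brelu_layer(x, W, biases):
    """Sketch of a biased-ReLU layer: each input variable is passed
    through several ReLU units with distinct biases before linear mixing."""
    # x: (n_samples, d); biases: (d, k), i.e. k biases per input variable.
    # Expand feature j into k shifted ReLU features max(x_j - b_{j,i}, 0).
    feats = np.maximum(x[:, :, None] - biases[None, :, :], 0.0)  # (n, d, k)
    feats = feats.reshape(x.shape[0], -1)                        # (n, d*k)
    return feats @ W                                             # linear mix

def biases_from_stats(x, k):
    """Assumed heuristic: place k biases per feature from the empirical
    mean and standard deviation of that feature (standing in for the
    batch-normalization statistics used in the paper)."""
    mu, sigma = x.mean(axis=0), x.std(axis=0)
    offsets = np.linspace(-1.0, 1.0, k)  # evenly spaced in std units
    return mu[:, None] + sigma[:, None] * offsets[None, :]  # (d, k)
```

Because each of the d inputs contributes k breakpoints instead of one, a single such layer partitions the input space into more linear regions than a plain ReLU layer of the same input dimension, which is the flexibility gain the abstract refers to.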
Pages: 71-79
Page count: 9
Related Papers
50 records in total
  • [1] On the Error Bounds for ReLU Neural Networks
    Katende, Ronald
    Kasumba, Henry
    Kakuba, Godwin
    Mango, John
    IAENG International Journal of Applied Mathematics, 2024, 54 (12) : 2602 - 2611
  • [2] Advances in verification of ReLU neural networks
    Rössig, Ansgar
    Petkovic, Milena
    Journal of Global Optimization, 2021, 81 (01) : 109 - 152
  • [3] Pathwise Explanation of ReLU Neural Networks
    Lim, Seongwoo
    Jo, Won
    Lee, Joohyung
    Choi, Jaesik
    International Conference on Artificial Intelligence and Statistics, Vol 238, 2024
  • [4] Random Sketching for Neural Networks With ReLU
    Wang, Di
    Zeng, Jinshan
    Lin, Shao-Bo
    IEEE Transactions on Neural Networks and Learning Systems, 2021, 32 (02) : 748 - 762
  • [5] Building Energy Optimization Based on Biased ReLU Neural Network
    Li, Hongyi
    Liang, Xinglong
    Xu, Jun
    Proceedings of the 40th Chinese Control Conference (CCC), 2021 : 5933 - 5938
  • [6] Approximating Model Predictive Controller With Biased ReLU Neural Network
    Wang, Kai
    Liang, Xinglong
    Xu, Jun
    Chinese Automation Congress (CAC 2020), 2020 : 3780 - 3785
  • [7] Optimal function approximation with ReLU neural networks
    Liu, Bo
    Liang, Yi
    Neurocomputing, 2021, 435 : 216 - 227
  • [8] Locally linear attributes of ReLU neural networks
    Sattelberg, Ben
    Cavalieri, Renzo
    Kirby, Michael
    Peterson, Chris
    Beveridge, Ross
    Frontiers in Artificial Intelligence, 2023, 6
  • [9] A multivariate Riesz basis of ReLU neural networks
    Schneider, Cornelia
    Vybiral, Jan
    Applied and Computational Harmonic Analysis, 2024, 68