Learnable Leaky ReLU (LeLeLU): An Alternative Accuracy-Optimized Activation Function

Cited by: 44
Authors
Maniatopoulos, Andreas [1 ]
Mitianoudis, Nikolaos [1 ]
Affiliations
[1] Democritus Univ Thrace, Dept Elect & Comp Engn, Xanthi 67100, Greece
Keywords
activation function; ReLU family; activation function test
DOI
10.3390/info12120513
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
In neural networks, the activation function is a vital component of both learning and inference. Many choices exist, but only nonlinear activation functions (nonlinearities) allow a network to solve non-trivial problems with a small number of nodes. With the rise of deep learning has come the need for effective activation functions that enable or expedite learning in deeper layers. In this paper, we propose a novel activation function that combines features of several successful activation functions and achieves 2.53% higher accuracy than the industry-standard ReLU across a variety of test cases.
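The record does not reproduce the proposed function itself. As a rough illustration only, the Python sketch below implements a learnable leaky ReLU in PyTorch of the form f(x) = alpha * x for x >= 0 and negative_slope * alpha * x otherwise, with a per-layer trainable alpha; the class name, parameter name, initial value of alpha, and the fixed 0.01 negative slope are assumptions made for this example and are not taken from the record.

import torch
import torch.nn as nn

class LeLeLU(nn.Module):
    # Sketch of a learnable leaky ReLU: f(x) = alpha * x for x >= 0,
    # and negative_slope * alpha * x otherwise; alpha is trained by backprop.
    def __init__(self, init_alpha: float = 1.0, negative_slope: float = 0.01):
        super().__init__()
        # One trainable scale per layer (assumed initialisation).
        self.alpha = nn.Parameter(torch.tensor(init_alpha))
        self.negative_slope = negative_slope

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Scale both branches of a leaky ReLU by the learnable alpha.
        return self.alpha * torch.where(x >= 0, x, self.negative_slope * x)

# Usage: gradients flow to both the input and alpha.
act = LeLeLU()
x = torch.randn(4, 8, requires_grad=True)
act(x).sum().backward()
print(act.alpha.grad)  # alpha receives a gradient and can be updated by the optimiser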
Pages: 16