Study of the Effect of Combining Activation Functions in a Convolutional Neural Network

Cited by: 6
Authors
Guevara, M. [1 ]
Cruz, V [2 ]
Vergara, O. [2 ]
Nandayapa, M. [2 ]
Ochoa, H. [2 ]
Sossa, H. [3 ]
Affiliations
[1] Univ Autonoma Ciudad Juarez, Programa Doctorado Ciencias Ingn Avanzada, Ciudad Juarez, Chihuahua, Mexico
[2] Univ Autonoma Ciudad Juarez, Ciudad Juarez, Chihuahua, Mexico
[3] Inst Politecn Nacl CIC IPN, Ciudad De Mexico, Mexico
Keywords
Silicon compounds; Databases; Standards; IEEE transactions; Convolutional neural networks; Silicon; Neural networks; activation functions
DOI
10.1109/TLA.2021.9448319
CLC number
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
Convolutional Neural Networks (CNNs) have proven to be an effective approach to image classification problems. The output, accuracy, and computational efficiency of a CNN are determined mainly by its architecture, its convolutional filters, and its activation functions. Given the importance of the activation function, this paper presents nine new activation functions based on combinations of classical functions such as ReLU and sigmoid, together with a study of the effects these functions have on the performance of a CNN. First, each new function is described, and its graph, analytic form, and derivative are presented. Then, a traditional CNN model equipped with each new activation function is used to classify three 10-class databases: MNIST, Fashion MNIST, and a handwritten-digit database created by the authors. Experimental results show that some of the proposed activation functions yield better classification performance than classical activation functions; moreover, the study demonstrates that the accuracy of a CNN can be increased by up to 1.18% with the newly proposed functions.
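The abstract does not specify the nine combinations, but the general idea of building a new activation from classical ones can be sketched as follows. This is a minimal illustration, not one of the paper's actual definitions: `relu_sigmoid_product` is a hypothetical combination (the elementwise product of ReLU and sigmoid) chosen only to show the pattern.

```python
import math

def relu(x):
    # Classical ReLU: max(0, x)
    return max(0.0, x)

def sigmoid(x):
    # Classical logistic sigmoid: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

def relu_sigmoid_product(x):
    # Hypothetical combined activation: ReLU(x) * sigmoid(x).
    # Not necessarily one of the paper's nine functions; it only
    # illustrates combining two classical activations into a new one.
    return relu(x) * sigmoid(x)

# The combination is zero for x <= 0 (like ReLU) and smoothly
# approaches the identity for large positive x (sigmoid -> 1).
print(relu_sigmoid_product(-2.0))  # 0.0
print(relu_sigmoid_product(2.0))
```

In a CNN framework, such a function would replace the activation applied after each convolutional layer; its derivative (needed for backpropagation) follows from the product rule on the two classical components.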
Pages: 844-852
Page count: 9
References
18 records
[1]  
[Anonymous], 2019, arXiv:1803.08375v2
[2]   Linearized sigmoidal activation: A novel activation function with tractable non-linear characteristics to boost representation capability [J].
Bawa, Vivek Singh ;
Kumar, Vinay .
EXPERT SYSTEMS WITH APPLICATIONS, 2019, 120 :346-356
[3]  
Eger S., 2019, arXiv preprint
[4]  
Lin Guifang, 2018, Procedia Computer Science, V131, P977, DOI 10.1016/j.procs.2018.04.239
[5]  
He K., Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), P770, DOI 10.1109/CVPR.2016.90
[6]   Deep neural networks with Elastic Rectified Linear Units for object recognition [J].
Jiang, Xiaoheng ;
Pang, Yanwei ;
Li, Xuelong ;
Pan, Jing ;
Xie, Yinghong .
NEUROCOMPUTING, 2018, 275 :1132-1139
[7]   ImageNet Classification with Deep Convolutional Neural Networks [J].
Krizhevsky, Alex ;
Sutskever, Ilya ;
Hinton, Geoffrey E. .
COMMUNICATIONS OF THE ACM, 2017, 60 (06) :84-90
[8]   Backpropagation Applied to Handwritten Zip Code Recognition [J].
LeCun, Y. ;
Boser, B. ;
Denker, J. S. ;
Henderson, D. ;
Howard, R. E. ;
Hubbard, W. ;
Jackel, L. D. .
NEURAL COMPUTATION, 1989, 1 (04) :541-551
[9]   Gradient-based learning applied to document recognition [J].
Lecun, Y ;
Bottou, L ;
Bengio, Y ;
Haffner, P .
PROCEEDINGS OF THE IEEE, 1998, 86 (11) :2278-2324
[10]  
LeCun Y., 1989, Adv. Neural Inf. Process. Syst., V2, DOI 10.5555/2969830.2969879