Analysis of non-linear activation functions for classification tasks using convolutional neural networks

Cited by: 9
Authors
Dureja A. [1 ]
Pahwa P. [2 ]
Affiliations
[1] Computer Science & Engineering, USICT, GGSIPU, New Delhi
[2] Computer Science & Engineering, BPIT, Rohini, New Delhi
Keywords
Activation function; CNN; Deep neural networks; Hidden layers; Machine learning; Non-linear problems
DOI
10.2174/2213275911666181025143029
Abstract
Background: Activation functions play an important role in building deep neural networks, and their choice affects both optimization and the quality of the results. Several activation functions have been introduced in machine learning for practical applications, but which one should be used in the hidden layers of a deep neural network has not been established. Objective: The primary objective of this analysis was to determine which activation function should be used in the hidden layers of deep neural networks to solve complex non-linear problems. Methods: The comparative model was configured on a two-class (Cat/Dog) dataset. The network used 3 convolutional layers, each followed by a pooling layer. The dataset was split into two parts: the first 8000 images were used for training the network and the remaining 2000 images for testing. Results: The experimental comparison analyzed the network with different activation functions (ReLU, Tanh, SELU, PReLU, ELU) applied at the hidden layers, measuring validation error and accuracy on the Cat/Dog dataset. Overall, ReLU gave the best performance, with a validation loss of 0.3912 and a validation accuracy of 0.8320 at the 25th epoch. Conclusion: A CNN model with ReLU in its hidden layers (3 hidden layers here) gives the best results and improves overall performance in terms of both accuracy and speed. These advantages of ReLU across the hidden layers of a CNN help retrieve images from databases effectively and quickly. © 2019 Bentham Science Publishers.
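The five activation functions compared in the abstract can be sketched in NumPy using their standard definitions. The parameter values below (SELU's alpha/scale constants, ELU's alpha, PReLU's slope) are common defaults and are not taken from the paper; in an actual network, PReLU's slope would be a learned parameter.

```python
import numpy as np

def relu(x):
    # max(0, x): the function the paper found to perform best
    return np.maximum(0.0, x)

def tanh(x):
    # hyperbolic tangent, saturating in (-1, 1)
    return np.tanh(x)

def elu(x, alpha=1.0):
    # exponential linear unit; alpha=1.0 is a common default
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # scaled ELU with the standard self-normalizing constants
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def prelu(x, a=0.25):
    # parametric ReLU; slope `a` is learned in practice, 0.25 is a common init
    return np.where(x > 0, x, a * x)
```

All five are elementwise and differ only in how they treat negative inputs, which is what drives the differences in validation loss and accuracy the paper reports.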
Pages: 156-161
Page count: 5