Open Access Article

Survey on Activation Functions in Convolution Neural Network

Sandeep Gond¹, Gurneet Bhamra², Jyoti Kharade³

Section: Survey Paper, Product Type: Journal Paper
Volume-7, Issue-5, Page no. 955-960, May-2019

CrossRef-DOI: https://doi.org/10.26438/ijcse/v7i5.955960

Published online on May 31, 2019

Copyright © Sandeep Gond, Gurneet Bhamra, Jyoti Kharade. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


How to Cite this Paper


IEEE Style Citation: Sandeep Gond, Gurneet Bhamra, Jyoti Kharade, “Survey on Activation Functions in Convolution Neural Network,” International Journal of Computer Sciences and Engineering, Vol.7, Issue.5, pp.955-960, 2019.

MLA Style Citation: Sandeep Gond, Gurneet Bhamra, Jyoti Kharade. "Survey on Activation Functions in Convolution Neural Network." International Journal of Computer Sciences and Engineering 7.5 (2019): 955-960.

APA Style Citation: Sandeep Gond, Gurneet Bhamra, Jyoti Kharade (2019). Survey on Activation Functions in Convolution Neural Network. International Journal of Computer Sciences and Engineering, 7(5), 955-960.

BibTex Style Citation:
@article{Gond_2019,
author = {Sandeep Gond and Gurneet Bhamra and Jyoti Kharade},
title = {Survey on Activation Functions in Convolution Neural Network},
journal = {International Journal of Computer Sciences and Engineering},
issue_date = {May 2019},
volume = {7},
number = {5},
month = {May},
year = {2019},
issn = {2347-2693},
pages = {955-960},
url = {https://www.ijcseonline.org/full_paper_view.php?paper_id=4345},
doi = {10.26438/ijcse/v7i5.955960},
publisher = {IJCSE, Indore, INDIA},
}

RIS Style Citation:
TY - JOUR
DO - 10.26438/ijcse/v7i5.955960
UR - https://www.ijcseonline.org/full_paper_view.php?paper_id=4345
TI - Survey on Activation Functions in Convolution Neural Network
T2 - International Journal of Computer Sciences and Engineering
AU - Sandeep Gond
AU - Gurneet Bhamra
AU - Jyoti Kharade
PY - 2019
DA - 2019/05/31
PB - IJCSE, Indore, INDIA
SP - 955
EP - 960
IS - 5
VL - 7
SN - 2347-2693
ER -

Views: 494 | PDF downloads: 281 | XML downloads: 127

Abstract

The recognition of handwritten digits is useful in various domains such as banking (fraud detection), writer identification (in criminal investigations), autonomous cars (reading and identifying speed limits and numeric signs), and license plate readers (in parking structures and security cameras). Deep Learning, a subfield of Machine Learning, is used for the task of image classification, which it accomplishes with neural networks. Among these, the neural network best suited to image classification is the Convolutional Neural Network. Convolutional Neural Networks are very similar to ordinary neural networks: they are made up of layers of neurons with learnable weights and biases, together with activation functions. Activation functions control the output of each neuron at every layer. In this paper, we study the role of the Sigmoid and ReLU (Rectified Linear Unit) activation functions in a Convolutional Neural Network and compare which of the two provides the higher accuracy on the image classification task.
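As a rough illustration of the two functions the paper compares, the sketch below defines Sigmoid and ReLU in NumPy and builds a minimal Keras CNN for 28x28 grayscale digit images in which the hidden-layer activation can be swapped between "relu" and "sigmoid". The architecture, layer sizes, and the build_cnn helper are illustrative assumptions, not the authors' exact network.

import numpy as np
import tensorflow as tf

def sigmoid(x):
    # Sigmoid squashes any real input into (0, 1): 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # ReLU keeps positive inputs unchanged and clamps negatives to 0: max(0, x)
    return np.maximum(0.0, x)

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))  # approx. [0.119 0.5 0.881]
print(relu(z))     # [0. 0. 2.]

def build_cnn(activation="relu"):
    # Minimal CNN for 28x28 grayscale digits (e.g. MNIST); call with
    # activation="sigmoid" to train the same architecture with the other
    # function and compare the resulting classification accuracy.
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, (3, 3), activation=activation, input_shape=(28, 28, 1)),
        tf.keras.layers.MaxPooling2D((2, 2)),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation=activation),
        tf.keras.layers.Dense(10, activation="softmax"),  # 10 digit classes
    ])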

Key-Words / Index Term

Artificial Intelligence, Convolutional Neural Networks, Deep Learning, Activation Function, Sigmoid, ReLU
