
A Comprehensive Study of Deep Learning Architectures, Applications and Tools

Nilay Ganatra, Atul Patel

Section: Review Paper, Product Type: Journal Paper
Volume-6 , Issue-12 , Page no. 701-705, Dec-2018

CrossRef DOI: https://doi.org/10.26438/ijcse/v6i12.701705

Published online on Dec 31, 2018

Copyright © Nilay Ganatra, Atul Patel. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


How to Cite this Paper


IEEE Style Citation: Nilay Ganatra, Atul Patel, “A Comprehensive Study of Deep Learning Architectures, Applications and Tools,” International Journal of Computer Sciences and Engineering, Vol.6, Issue.12, pp.701-705, 2018.

MLA Style Citation: Ganatra, Nilay, and Atul Patel. "A Comprehensive Study of Deep Learning Architectures, Applications and Tools." International Journal of Computer Sciences and Engineering 6.12 (2018): 701-705.

APA Style Citation: Ganatra, N., & Patel, A. (2018). A Comprehensive Study of Deep Learning Architectures, Applications and Tools. International Journal of Computer Sciences and Engineering, 6(12), 701-705.

BibTex Style Citation:
@article{Ganatra_2018,
  author = {Nilay Ganatra and Atul Patel},
  title = {A Comprehensive Study of Deep Learning Architectures, Applications and Tools},
  journal = {International Journal of Computer Sciences and Engineering},
  volume = {6},
  number = {12},
  month = {December},
  year = {2018},
  issn = {2347-2693},
  pages = {701-705},
  url = {https://www.ijcseonline.org/full_paper_view.php?paper_id=3400},
  doi = {10.26438/ijcse/v6i12.701705},
  publisher = {IJCSE, Indore, INDIA},
}

RIS Style Citation:
TY  - JOUR
DO  - 10.26438/ijcse/v6i12.701705
UR  - https://www.ijcseonline.org/full_paper_view.php?paper_id=3400
TI  - A Comprehensive Study of Deep Learning Architectures, Applications and Tools
T2  - International Journal of Computer Sciences and Engineering
AU  - Ganatra, Nilay
AU  - Patel, Atul
PY  - 2018
DA  - 2018/12/31
PB  - IJCSE, Indore, INDIA
SP  - 701
EP  - 705
IS  - 12
VL  - 6
SN  - 2347-2693
ER  -


Abstract

Deep learning architectures belong to the broad family of machine learning algorithms based on the model of the artificial neural network. Rapid technological advancements during the last decade have created many new possibilities to collect and maintain large amounts of data. Deep learning is considered a prominent field for processing, analyzing, and generating patterns from such large amounts of data, and is used in various domains including medical diagnosis, precision agriculture, education, market analysis, natural language processing, recommendation systems, and several others. Without human intervention, deep learning models are capable of producing appropriate results that are comparable to, and sometimes even superior to, those of humans. This paper discusses the background of deep learning and its architectures, deep learning applications developed or proposed by various researchers in different domains, and various deep learning tools.
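To make the artificial neural network model underlying deep learning concrete, the following is a minimal sketch (not taken from the paper) of a tiny two-layer network trained with gradient descent to learn the XOR function, using only NumPy; all names and hyperparameters here are illustrative choices.

```python
# Minimal artificial-neural-network sketch: a 2-4-1 sigmoid network
# learning XOR with full-batch gradient descent (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and target outputs
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialised weights and zero biases for a 2-4-1 network
W1 = rng.normal(0.0, 1.0, (2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(0.0, 1.0, (4, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10000):
    # Forward pass: input -> hidden layer -> output
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of mean squared error via the chain rule
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent parameter updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

pred = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(pred.ravel())  # predicted XOR outputs, typically [0 1 1 0] after training
```

Stacking more such hidden layers, each learning features of the layer below, is what distinguishes the deep architectures surveyed in the paper from this shallow example.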

Key-Words / Index Term

deep learning, architectures, applications, tools
