International Journal of
Computer Sciences and Engineering

Scholarly Peer-Reviewed, and Fully Refereed Scientific Research Journal
Analysis of Speed, Accuracy of Deep Learning using Gini index, HSM based fuzzy decision trees
S.V.G.Reddy 1 , K.Thammi Reddy2 , V.Valli Kumari3
1 Dept. of CSE, GIT (GITAM University), Visakhapatnam, India.
2 Dept. of CSE, GIT (GITAM University), Visakhapatnam, India.
3 Dept. of CS and SE, College of Engineering (Andhra University), Visakhapatnam, India.
Correspondence should be addressed to: venkat157.reddy@gmail.com.

Section: Research Paper, Product Type: Journal Paper
Volume-5 , Issue-11 , Page no. 15-23, Nov-2017

CrossRef-DOI:   https://doi.org/10.26438/ijcse/v5i11.1523

Online published on Nov 30, 2017

Copyright © S.V.G.Reddy, K.Thammi Reddy, V.Valli Kumari . This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
 
Citation

IEEE Style Citation: S.V.G.Reddy, K.Thammi Reddy, V.Valli Kumari, “Analysis of Speed, Accuracy of Deep Learning using Gini index, HSM based fuzzy decision trees”, International Journal of Computer Sciences and Engineering, Vol.5, Issue.11, pp.15-23, 2017.

MLA Style Citation: S.V.G.Reddy, K.Thammi Reddy, V.Valli Kumari "Analysis of Speed, Accuracy of Deep Learning using Gini index, HSM based fuzzy decision trees." International Journal of Computer Sciences and Engineering 5.11 (2017): 15-23.

APA Style Citation: S.V.G.Reddy, K.Thammi Reddy, V.Valli Kumari, (2017). Analysis of Speed, Accuracy of Deep Learning using Gini index, HSM based fuzzy decision trees. International Journal of Computer Sciences and Engineering, 5(11), 15-23.
Abstract:
Deep learning has gained tremendous importance due to its advances in text mining, speech recognition, computer vision, natural language processing, and other fields. The weights of the input-layer attributes and of the series of hidden layers play a dominant role in the classification speed and accuracy of a deep network. This paper proposes a weight adjustment algorithm for deep learning. In general, weights can be determined by mathematical techniques, suggested by domain experts, or initialized at random. In the proposed work, the weights of a neural network are computed mathematically by constructing a fuzzy decision tree: the maximum heterogeneous split measure (HSM) value of each attribute in the fuzzy decision tree is used as the weight of the corresponding attribute in the weight adjustment algorithm for neural-network classification. With the computed HSM weights, the deep network achieves fast classification and high accuracy, outperforming fuzzy decision tree classifiers. The same procedure was also carried out using the least value of the Gini index. In this paper, the classification speed and accuracy obtained with the Gini-index-based and HSM-based fuzzy decision trees are compared and the results analyzed.
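The idea of deriving initial network weights from a split measure can be sketched for the Gini-index variant roughly as follows. This is a minimal illustrative sketch, not the paper's exact method: the toy data, the single-threshold decision-stump splits, and the 1 − Gini weighting are all assumptions introduced here for clarity.

```python
def gini_index(groups, classes):
    # Weighted Gini index of a candidate binary split (lower = purer).
    n = sum(len(g) for g in groups)
    gini = 0.0
    for g in groups:
        if not g:
            continue
        score = sum((g.count(c) / len(g)) ** 2 for c in classes)
        gini += (1.0 - score) * (len(g) / n)
    return gini

def split_on_attribute(X, y, attr, threshold):
    # Partition the class labels by thresholding one attribute.
    left = [y[i] for i in range(len(X)) if X[i][attr] < threshold]
    right = [y[i] for i in range(len(X)) if X[i][attr] >= threshold]
    return [left, right]

# Hypothetical toy data: two attributes, binary class labels.
X = [[2.7, 1.0], [1.5, 2.2], [3.3, 0.5], [1.2, 2.9], [3.0, 0.8], [1.0, 2.5]]
y = [0, 1, 0, 1, 0, 1]
classes = [0, 1]

# For each attribute, take its best (lowest) Gini over candidate thresholds.
# A lower Gini means a more discriminative attribute, so it receives a
# larger initial weight (here via the illustrative choice 1 - Gini).
raw = []
for attr in range(2):
    best = min(gini_index(split_on_attribute(X, y, attr, row[attr]), classes)
               for row in X)
    raw.append(1.0 - best)

total = sum(raw)
initial_weights = [w / total for w in raw]
print(initial_weights)
```

The HSM-based variant of the paper would substitute the heterogeneous split measure for the Gini index and take the maximum rather than the minimum, since higher HSM values indicate better splits.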
Key-Words / Index Term :
Deep learning, heterogeneous split measure, Gini index, weight, fuzzy, decision trees, classification accuracy