Open Access Article

Enhancing Classification Performance with Subset and Feature Selection Schemes

Aniket G. Meshram¹, K. Rajeswari², V. Vaithiyanathan³

Section: Research Paper, Product Type: Journal Paper
Volume-3, Issue-5, Page no. 187-191, May-2015

Published online on May 30, 2015

Copyright © Aniket G. Meshram, K. Rajeswari, V. Vaithiyanathan. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


How to Cite this Paper

IEEE Style Citation: Aniket G. Meshram, K. Rajeswari, and V. Vaithiyanathan, “Enhancing Classification Performance with Subset and Feature Selection Schemes,” International Journal of Computer Sciences and Engineering, Vol. 3, Issue 5, pp. 187-191, 2015.

MLA Style Citation: Aniket G. Meshram, K. Rajeswari, and V. Vaithiyanathan. "Enhancing Classification Performance with Subset and Feature Selection Schemes." International Journal of Computer Sciences and Engineering 3.5 (2015): 187-191.

APA Style Citation: Aniket G. Meshram, K. Rajeswari, and V. Vaithiyanathan (2015). Enhancing Classification Performance with Subset and Feature Selection Schemes. International Journal of Computer Sciences and Engineering, 3(5), 187-191.

BibTex Style Citation:
@article{Meshram_2015,
  author = {Aniket G. Meshram and K. Rajeswari and V. Vaithiyanathan},
  title = {Enhancing Classification Performance with Subset and Feature Selection Schemes},
  journal = {International Journal of Computer Sciences and Engineering},
  issue_date = {May 2015},
  volume = {3},
  number = {5},
  month = {May},
  year = {2015},
  issn = {2347-2693},
  pages = {187-191},
  url = {https://www.ijcseonline.org/full_paper_view.php?paper_id=501},
  publisher = {IJCSE, Indore, INDIA},
}

RIS Style Citation:
TY  - JOUR
UR  - https://www.ijcseonline.org/full_paper_view.php?paper_id=501
TI  - Enhancing Classification Performance with Subset and Feature Selection Schemes
T2  - International Journal of Computer Sciences and Engineering
AU  - Aniket G. Meshram
AU  - K. Rajeswari
AU  - V. Vaithiyanathan
PY  - 2015
DA  - 2015/05/30
PB  - IJCSE, Indore, INDIA
SP  - 187
EP  - 191
IS  - 5
VL  - 3
SN  - 2347-2693
ER  -


Abstract

Classification is an important step in data mining for categorizing large amounts of data. Many classifiers are in use today for classifying large data sets, and some have shown better performance than others; even so, there is still room to improve the classification process. One such improvement is to select the features that most strongly influence the outcome of classification. Here we therefore apply a set of five attribute selection techniques to two classifiers to show how attribute selection affects classification performance. We compare and discuss two well-known classification schemes, MultiLayer Perceptron and Simple Logistics, under each of these five attribute selection techniques. The aim of this study is to demonstrate and understand the behaviour of these classifiers when subjected to attribute selection schemes.
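As an illustration of the kind of experiment described above, the following is a minimal sketch using the Weka Java API, not the authors' actual code. It assumes an ARFF dataset at a placeholder path, uses InfoGain with a Ranker search as one example of an attribute selection technique, and wraps the MultilayerPerceptron classifier (SimpleLogistic can be substituted) so that selection is applied inside each cross-validation fold:

import weka.attributeSelection.InfoGainAttributeEval;
import weka.attributeSelection.Ranker;
import weka.classifiers.Evaluation;
import weka.classifiers.functions.MultilayerPerceptron;
import weka.classifiers.meta.AttributeSelectedClassifier;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import java.util.Random;

public class AttributeSelectionDemo {
    public static void main(String[] args) throws Exception {
        // Load an ARFF dataset; "diabetes.arff" is a placeholder path, not the paper's dataset.
        Instances data = DataSource.read("diabetes.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // Wrap the base classifier so attribute selection is re-run inside each fold,
        // which avoids selection bias in the cross-validation estimate.
        AttributeSelectedClassifier asc = new AttributeSelectedClassifier();
        asc.setEvaluator(new InfoGainAttributeEval());   // one example evaluator (assumption)
        Ranker ranker = new Ranker();
        ranker.setNumToSelect(5);                        // keep the 5 top-ranked attributes (assumption)
        asc.setSearch(ranker);
        asc.setClassifier(new MultilayerPerceptron());   // or new SimpleLogistic()

        // 10-fold cross-validation and a summary of the classification performance.
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(asc, data, 10, new Random(1));
        System.out.println(eval.toSummaryString());
    }
}

Subset-based schemes can be sketched the same way by swapping in, for example, Weka's CfsSubsetEval evaluator with a BestFirst search; the specific evaluators, search methods, and parameter settings used in the paper may differ.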

Key-Words / Index Term

MultiLayer Perceptron, Simple Logistics, Attribute Selection Techniques, Subset Evaluation, Weka
