
Designing a Classifier Using Unsupervised Learning and Rough Set Theory

Vairaprakash Gurusamy¹, K. Nandhini²

  1. Department of Computer Applications, School of IT, Madurai Kamaraj University, Madurai, India.
  2. Technical Support Engineer, Concentrix India Pvt Ltd, Chennai, India.

Correspondence should be addressed to: vairaprakashmca@gmail.com.

Section: Research Paper, Product Type: Journal Paper
Volume-5, Issue-10, Page no. 226-230, Oct-2017

CrossRef-DOI: https://doi.org/10.26438/ijcse/v5i10.226230

Online published on Oct 30, 2017

Copyright © Vairaprakash Gurusamy, K. Nandhini. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


How to Cite this Paper


IEEE Style Citation: Vairaprakash Gurusamy, K. Nandhini, “Designing a Classifier Using Unsupervised Learning and Rough Set Theory,” International Journal of Computer Sciences and Engineering, Vol.5, Issue.10, pp.226-230, 2017.

MLA Style Citation: Vairaprakash Gurusamy, K. Nandhini. "Designing a Classifier Using Unsupervised Learning and Rough Set Theory." International Journal of Computer Sciences and Engineering 5.10 (2017): 226-230.

APA Style Citation: Vairaprakash Gurusamy, K. Nandhini (2017). Designing a Classifier Using Unsupervised Learning and Rough Set Theory. International Journal of Computer Sciences and Engineering, 5(10), 226-230.

BibTex Style Citation:
@article{Gurusamy_2017,
author = {Vairaprakash Gurusamy and K. Nandhini},
title = {Designing a Classifier Using Unsupervised Learning and Rough Set Theory},
journal = {International Journal of Computer Sciences and Engineering},
issue_date = {October 2017},
volume = {5},
number = {10},
month = {10},
year = {2017},
issn = {2347-2693},
pages = {226-230},
url = {https://www.ijcseonline.org/full_paper_view.php?paper_id=1502},
doi = {https://doi.org/10.26438/ijcse/v5i10.226230},
publisher = {IJCSE, Indore, INDIA},
}

RIS Style Citation:
TY - JOUR
DO - 10.26438/ijcse/v5i10.226230
UR - https://www.ijcseonline.org/full_paper_view.php?paper_id=1502
TI - Designing a Classifier Using Unsupervised Learning and Rough Set Theory
T2 - International Journal of Computer Sciences and Engineering
AU - Gurusamy, Vairaprakash
AU - Nandhini, K.
PY - 2017
DA - 2017/10/30
PB - IJCSE, Indore, INDIA
SP - 226-230
IS - 10
VL - 5
SN - 2347-2693
ER -


Abstract

Data collected from multiple sources are often inconsistent, yielding different decision labels for the same conditional attribute values. A method for handling such inconsistency is proposed here using a Kohonen self-organizing neural network, an unsupervised learning approach. After the inconsistency is removed, minimal subsets of attributes, called reducts, are selected using Rough Set Theory, which effectively reduces the dimensionality of the dataset. Unlike most existing reduct-generation algorithms, which examine all attributes, the proposed method does not need to evaluate every attribute, so its time complexity is improved considerably. In the next step, taking the core attribute as the root node of a decision tree, all possible rules are generated and then pruned based on information entropy and the coverage of the rule set. The classifier built from the reduced rule set yields results comparable to a classifier that uses all attributes.
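As a rough illustration of the first two steps of this pipeline (inconsistency removal with a Kohonen self-organizing map, followed by reduct selection), the sketch below is a minimal reading of the abstract, not the authors' implementation. The function names (som_relabel, dependency_degree, find_reduct), the SOM parameters, and the toy decision table are assumptions introduced for illustration; in particular, the reduct search here is brute force, whereas the paper's contribution is precisely that not every attribute needs to be examined.

# Minimal sketch (not the authors' code) of the first two steps described in the
# abstract: a 1-D Kohonen self-organizing map reconciles conflicting decision
# labels, and a brute-force subset search finds a rough-set reduct. Function
# names, parameters, and the toy decision table are illustrative assumptions.
import itertools
from collections import Counter, defaultdict

import numpy as np


def som_relabel(X, y, n_units=4, epochs=200, lr0=0.5, seed=0):
    """Train a 1-D Kohonen SOM on the conditional attributes, then replace each
    record's decision label with the majority label of its winning unit, so
    records with identical attribute values no longer carry conflicting labels."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    w = rng.normal(size=(n_units, X.shape[1]))            # unit weight vectors
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)                     # decaying learning rate
        radius = max(1.0, (n_units / 2) * (1.0 - t / epochs))
        for x in X[rng.permutation(len(X))]:
            bmu = int(np.argmin(np.linalg.norm(w - x, axis=1)))
            grid_dist = np.abs(np.arange(n_units) - bmu)  # distance on the 1-D grid
            h = np.exp(-(grid_dist ** 2) / (2 * radius ** 2))
            w += lr * h[:, None] * (x - w)                # pull neighbourhood toward x
    winners = [int(np.argmin(np.linalg.norm(w - x, axis=1))) for x in X]
    majority = {u: Counter(lbl for u2, lbl in zip(winners, y) if u2 == u).most_common(1)[0][0]
                for u in set(winners)}
    return [majority[u] for u in winners]


def dependency_degree(rows, attrs, labels):
    """Rough-set dependency gamma(R, d): fraction of records whose equivalence
    class under the attribute subset R maps to a single decision label."""
    classes = defaultdict(set)
    for row, lbl in zip(rows, labels):
        classes[tuple(row[a] for a in attrs)].add(lbl)
    positive = sum(1 for row in rows
                   if len(classes[tuple(row[a] for a in attrs)]) == 1)
    return positive / len(rows)


def find_reduct(rows, labels):
    """Smallest attribute subset with the same dependency degree as the full
    attribute set. Brute force for illustration only; the paper's algorithm
    avoids examining every attribute."""
    all_attrs = list(range(len(rows[0])))
    full = dependency_degree(rows, all_attrs, labels)
    for k in range(1, len(all_attrs) + 1):
        for subset in itertools.combinations(all_attrs, k):
            if dependency_degree(rows, list(subset), labels) == full:
                return list(subset)
    return all_attrs


if __name__ == "__main__":
    # Toy decision table: three conditional attributes, one decision attribute.
    rows = [(1, 0, 1), (1, 0, 1), (0, 1, 1), (0, 1, 0), (1, 1, 0)]
    labels = ["yes", "no", "no", "no", "yes"]   # rows 0 and 1 are inconsistent
    labels = som_relabel(rows, labels)          # step 1: resolve inconsistency
    print("reduct attribute indices:", find_reduct(rows, labels))

The final step of the paper, growing a decision tree rooted at the core attribute and pruning the generated rules by information entropy and coverage, is not shown in this sketch.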

Key-Words / Index Term

Inconsistency, Rough Set, Unsupervised Neural Network

References

[1]. Zdzislaw Pawlak, “Rough sets”, International Journal of Computer and Information Sciences, 11, 341-356, 1982.
[2]. I. Düntsch and G. Gediga, “Algebraic aspects of attribute dependencies in information systems”, Fundamenta Informaticae, Vol. 29, 1997, pp. 119-133.
[3]. A. Øhrn, “Discernibility and rough sets in medicine: tools and applications”, PhD thesis, Department of Computer and information science, Norwegian University of Science and Technology, 1999.
[4]. J.G. Bazan, M.S. Szczuka, and J. Wroblewski, “A new version of rough set exploration system,” Lecture notes in artificial intelligence, Vol.2475, 2002, pp. 397-404.
[5]. N. Ttow, D.R. Morse, and D.M. Roberts, “Rough set approximation as formal concept,” Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol.10, No.5, 2006, pp. 606-611.
[6]. P. Guo and H. Tanaka, “Upper and lower possibility distributions with rough set concepts,” in Rough Set Theory and Granular Computing, Springer, 2002, pp. 243-250.
[7]. Kohonen, T. “Self-Organizing Maps”, 3rd edition, Berlin: Springer-Verlag, 2001.
[8]. Zhangyan Xu, Liyu Huang, Wenbin Qian, Bingru Yang, “Quick Attribute Reduction Algorithm Based on Improved Frequent Pattern Tree”.
[9]. G. Ganesan, C. Raghavendra Rao, and D. Latha, “An overview of rough sets”, Proceedings of the National Conference on Emerging Trends in Pure and Applied Mathematics, Palayamkottai, India, pp. 70-76, 2005.
[10]. J. Han and M. Kamber, “Data Mining: Concepts and Techniques”, Morgan Kaufmann, 2001, pp. 279-325.
[11]. C.R. Rao and P.V. Kumar, “Functional Dependencies through Val”, ICCMSC ’99, India, TMH Publications, pp. 116-123, 1999.
[12]. Z. Pawlak, “Rough Sets: Theoretical Aspects of Reasoning about Data”, Kluwer Academic Publishers, 1991.
[13]. J.R. Quinlan, “Induction of Decision Trees”, Machine Learning, Vol.1, pp. 81-106, 1986.
[14]. Y. Ramadevi and C.R. Rao, “Knowledge Extraction Using Rough Sets – GPCR Classification”, International Conference on Bioinformatics and Diabetes Mellitus, India, 2006.
[15]. J. Starzyk, D.E. Nelson, and K. Sturtz, “Reduct Generation in Information Systems”, Bulletin of the International Rough Set Society, Vol.3, No.1/2, 1999.