Open Access Article

Obtaining New Facts from Knowledge Bases using Neural Tensor Networks

Sagarika Sahoo, Avani Jadeja

Section: Research Paper, Product Type: Journal Paper
Volume-3 , Issue-5 , Page no. 129-132, May-2015

Online published on May 30, 2015

Copyright © Sagarika Sahoo, Avani Jadeja. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

View this paper at Google Scholar | DPI Digital Library

How to Cite this Paper


IEEE Style Citation: Sagarika Sahoo and Avani Jadeja, “Obtaining New Facts from Knowledge Bases using Neural Tensor Networks,” International Journal of Computer Sciences and Engineering, Vol.3, Issue.5, pp.129-132, 2015.

MLA Style Citation: Sagarika Sahoo and Avani Jadeja. "Obtaining New Facts from Knowledge Bases using Neural Tensor Networks." International Journal of Computer Sciences and Engineering 3.5 (2015): 129-132.

APA Style Citation: Sagarika Sahoo and Avani Jadeja (2015). Obtaining New Facts from Knowledge Bases using Neural Tensor Networks. International Journal of Computer Sciences and Engineering, 3(5), 129-132.

BibTex Style Citation:
@article{Sahoo_2015,
author = {Sahoo, Sagarika and Jadeja, Avani},
title = {Obtaining New Facts from Knowledge Bases using Neural Tensor Networks},
journal = {International Journal of Computer Sciences and Engineering},
volume = {3},
number = {5},
month = {May},
year = {2015},
issn = {2347-2693},
pages = {129-132},
url = {https://www.ijcseonline.org/full_paper_view.php?paper_id=492},
publisher = {IJCSE, Indore, INDIA},
}

RIS Style Citation:
TY - JOUR
UR - https://www.ijcseonline.org/full_paper_view.php?paper_id=492
TI - Obtaining New Facts from Knowledge Bases using Neural Tensor Networks
T2 - International Journal of Computer Sciences and Engineering
AU - Sahoo, Sagarika
AU - Jadeja, Avani
PY - 2015
DA - 2015/05/30
PB - IJCSE, Indore, INDIA
SP - 129-132
IS - 5
VL - 3
SN - 2347-2693
ER -


Abstract

Knowledge bases are an important resource for easily accessible, systematic relational knowledge. They benefit applications such as question answering, but they often suffer from incompleteness and from an inability to reason over new entities and relations. Much work has been done on building knowledge bases and extending the relationships within them. This paper focuses on completing a knowledge base by reasoning over entity relationships with a Neural Tensor Network (NTN). The NTN predicts new relationships, which can then be added to the knowledge base. We show that the model improves when entities are represented with vectors learned from large unsupervised text corpora.
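As a rough illustration of the scoring model the abstract refers to, the sketch below follows the NTN formulation of Socher et al. [16]: a triple (e1, R, e2) is scored as u_R^T tanh(e1^T W_R^[1:k] e2 + V_R [e1; e2] + b_R), where the bilinear tensor W_R relates the two entity vectors slice by slice. The dimensions, random initialization, and function names here are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 100, 4  # entity-vector dimension and number of tensor slices (assumed sizes)

# Randomly initialized parameters for a single relation R (illustrative only;
# in training these are learned per relation via a contrastive max-margin objective)
W = rng.normal(scale=0.1, size=(k, d, d))   # bilinear tensor, one d x d slice per k
V = rng.normal(scale=0.1, size=(k, 2 * d))  # standard feed-forward layer weights
b = rng.normal(scale=0.1, size=k)           # bias
u = rng.normal(scale=0.1, size=k)           # combines the k slice activations

def ntn_score(e1, e2):
    """Plausibility score of (e1, R, e2): u^T tanh(e1^T W^[1:k] e2 + V [e1; e2] + b)."""
    bilinear = np.einsum('i,kij,j->k', e1, W, e2)   # one bilinear form per slice
    standard = V @ np.concatenate([e1, e2]) + b     # standard neural-network layer
    return u @ np.tanh(bilinear + standard)

e1, e2 = rng.normal(size=d), rng.normal(size=d)  # stand-ins for learned entity vectors
score = ntn_score(e1, e2)  # higher score = more plausible triple
```

In the completion setting described above, candidate triples scoring above a threshold would be proposed as new facts; the entity vectors e1 and e2 would come from the unsupervised word representations the abstract mentions rather than random draws.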

Key-Words / Index Term

Neural Tensor Network, Knowledge Base, Entity Vector, Bilinear Tensors, Unsupervised Text

References

[1] G.A. Miller. WordNet: A Lexical Database for English. Communications of the ACM, 1995.
[2] F.M. Suchanek, G. Kasneci, and G.Weikum. Yago: a core of semantic knowledge. In Proceedings of the 16th international conference on World Wide Web, 2007.
[3] J. Graupmann, R. Schenkel, and G. Weikum. The SphereSearch engine for unified ranked retrieval of heterogeneous XML and web documents. In Proceedings of the 31st international conference on Very large data bases, VLDB, 2005.
[4] V. Ng and C. Cardie. Improving machine learning approaches to coreference resolution. In ACL, 2002.
[5] R. Snow, D. Jurafsky, and A. Y. Ng. Learning syntactic patterns for automatic hypernym discovery. In NIPS, 2005.
[6] A. Fader, S. Soderland, and O. Etzioni. Identifying relations for open information extraction. In EMNLP, 2011.
[7] J. Turian, L. Ratinov, and Y. Bengio. Word representations: a simple and general method for semisupervised learning. In Proceedings of ACL, pages 384–394, 2010.
[8] A. Bordes, X. Glorot, J. Weston, and Y. Bengio. Joint Learning of Words and Meaning Representations for Open-Text Semantic Parsing. AISTATS, 2012.
[9] A. Bordes, J. Weston, R. Collobert, and Y. Bengio. Learning structured embeddings of knowledge bases. In AAAI, 2011.
[10] I. Sutskever, R. Salakhutdinov, and J. B. Tenenbaum. Modelling relational data using Bayesian clustered tensor factorization. In NIPS, 2009.
[11] R. Socher, B. Huval, C. D. Manning, and A. Y. Ng. Semantic Compositionality Through Recursive Matrix-Vector Spaces. In EMNLP, 2012.
[12] M. Ranzato, A. Krizhevsky, and G. E. Hinton. Factored 3-Way Restricted Boltzmann Machines for Modeling Natural Images. AISTATS, 2010.
[13] R. Collobert and J. Weston. A unified architecture for natural language processing: deep neural networks with multitask learning. In ICML, 2008.
[14] Y. Bengio, R. Ducharme, P. Vincent, and C. Janvin. A neural probabilistic language model. J. Mach. Learn. Res., 3, March 2003.
[15] E. H. Huang, R. Socher, C. D. Manning, and A. Y. Ng. Improving Word Representations via Global Context and Multiple Word Prototypes. In ACL, 2012.
[16] Richard Socher, Danqi Chen, Christopher D. Manning, and Andrew Y. Ng. 2013. Reasoning With Neural Tensor Networks For Knowledge Base Completion. In Advances in Neural Information Processing Systems 26.
[17] R. Socher, B. Huval, C. D. Manning, and A. Y. Ng. Semantic Compositionality Through Recursive Matrix-Vector Spaces. In EMNLP, 2012.
[18] Richard Socher, Recursive Deep Learning for Natural Language Processing and Computer Vision, 2014.
[19] R. Collobert and J.Weston. A unified architecture for natural language processing: deep neural networks with multitask learning. In ICML, 2008.
[20] R. Socher, E. H. Huang, J. Pennington, A. Y. Ng, and C. D. Manning. Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection. In NIPS. MIT Press, 2011.
[21] E. H. Huang, R. Socher, C. D. Manning, and A. Y. Ng. Improving Word Representations via Global Context and Multiple Word Prototypes. In ACL, 2012.
[22] R. Socher, A. Perelygin, J. Wu, J. Chuang, C. D. Manning, A. Y. Ng, and C. Potts. Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank. In EMNLP, 2013.
[23] L. Deng and D. Yu. "Deep Learning: Methods and Applications." http://research.microsoft.com/pubs/209355/DeepLearning-NowPublishing-Vol7-SIG-039.pdf
[24] Y. Bengio, A. Courville, and P. Vincent. "Representation Learning: A Review and New Perspectives," IEEE Trans. PAMI, special issue on Learning Deep Architectures, 2013.