Simplification with the Transformer - Its Drawbacks
K. Mehta, H. Chodvadiya, S.R. Sankhe
Section: Research Paper, Product Type: Journal Paper
Volume-8, Issue-6, Page no. 1-5, Jun-2020
CrossRef-DOI: https://doi.org/10.26438/ijcse/v8i6.15
Online published on Jun 30, 2020
Copyright © K. Mehta, H. Chodvadiya, S.R. Sankhe. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
How to Cite this Paper
IEEE Style Citation: K. Mehta, H. Chodvadiya, S.R. Sankhe, “Simplification with the Transformer - Its Drawbacks,” International Journal of Computer Sciences and Engineering, Vol.8, Issue.6, pp.1-5, 2020.
MLA Style Citation: K. Mehta, H. Chodvadiya, S.R. Sankhe. "Simplification with the Transformer - Its Drawbacks." International Journal of Computer Sciences and Engineering 8.6 (2020): 1-5.
APA Style Citation: K. Mehta, H. Chodvadiya, S.R. Sankhe, (2020). Simplification with the Transformer - Its Drawbacks. International Journal of Computer Sciences and Engineering, 8(6), 1-5.
BibTex Style Citation:
@article{Mehta_2020,
author = {K. Mehta and H. Chodvadiya and S.R. Sankhe},
title = {Simplification with the Transformer - Its Drawbacks},
journal = {International Journal of Computer Sciences and Engineering},
volume = {8},
number = {6},
month = jun,
year = {2020},
issn = {2347-2693},
pages = {1-5},
url = {https://www.ijcseonline.org/full_paper_view.php?paper_id=5136},
doi = {10.26438/ijcse/v8i6.15},
publisher = {IJCSE, Indore, INDIA},
}
RIS Style Citation:
TY  - JOUR
DO  - 10.26438/ijcse/v8i6.15
UR  - https://www.ijcseonline.org/full_paper_view.php?paper_id=5136
TI  - Simplification with the Transformer - Its Drawbacks
T2  - International Journal of Computer Sciences and Engineering
AU  - Mehta, K.
AU  - Chodvadiya, H.
AU  - Sankhe, S.R.
PY  - 2020
DA  - 2020/06/30
PB  - IJCSE, Indore, INDIA
SP  - 1
EP  - 5
IS  - 6
VL  - 8
SN  - 2347-2693
ER  -
Abstract
Natural Language Processing is an active and emerging field of research in computer science. Within it lies the subfield of text simplification, which aims to teach computers the so-far primarily manual task of simplifying text efficiently. While handcrafted systems using syntactic techniques were the first simplification systems, Recurrent Neural Networks and Long Short-Term Memory networks employed in seq2seq models with attention were considered state-of-the-art until very recently, when the transformer architecture did away with the computational problems that plagued them. This paper presents our work on simplification using the transformer architecture in the course of building an end-to-end simplification system for linguistically complex reference books written in English, along with our findings on the drawbacks/limitations of the transformer observed during that work. We call these drawbacks the Fact Illusion Induction, the Named Entity Problem and the Deep Network Problem, and theorize possible reasons for them.
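The abstract's contrast between recurrent seq2seq models and the transformer rests on the latter's core operation, scaled dot-product attention (reference [9]): every output position attends to all input positions at once, so no recurrence is needed. A minimal pure-Python sketch of that single operation is shown below; the matrix values and dimensions are illustrative only, not from the paper.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.

    Q, K, V are lists of row vectors; each query row is compared
    against every key row, and the resulting weights mix the values.
    """
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Weighted sum of the value rows
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

Because the attention weights always sum to one, a query attending over identical value rows simply returns that row; in a full transformer this operation is applied in parallel across multiple heads and layers.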
Key-Words / Index Term
Artificial Intelligence, Natural Language Processing, Neural Networks, Text Simplification, Transformer
References
[1] Y. Goldberg, G. Hirst, "Neural Network Methods in Natural Language Processing", Morgan & Claypool Publishers, USA, 2017. ISBN no. 9781627052955
[2] A. Siddharthan, "A Survey of Research on Text Simplification", ITL - International Journal of Applied Linguistics, Vol.165, No.2, pp.259-298, 2014.
[3] M. Shardlow, "A Survey of Automated Text Simplification", International Journal of Advanced Computer Science and Applications, Vol.4, No.1, pp.58-70, 2014.
[4] S. Wubben, E. Krahmer, A. van den Bosch, "Sentence Simplification by Monolingual Machine Translation", In the Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics, Republic of Korea, pp.1015-1024, 2012.
[5] T. Vu, B. Hu, T. Munkhdalai, H. Yu, "Sentence Simplification with Memory-augmented Neural Networks", In the Proceedings of the NAACL-HLT, USA, pp.79-85, 2018.
[6] W. Coster, D. Kauchak, "Simple English Wikipedia: A New Text Simplification Task", In the Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics, USA, pp.665-669, 2011.
[7] Z. Zhu, D. Bernhard, I. Gurevych, "A Monolingual Tree-based Translation Model for Sentence Simplification", In the Proceedings of the 23rd International Conference on Computational Linguistics, China, pp.1353-1361, 2010.
[8] S. Nisioi, S. Stajner, S. P. Ponzetto, L. P. Dinu, "Exploring Neural Text Simplification Models", In the Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, Canada, pp.85-91, 2017.
[9] A. Vaswani et al., "Attention is all you Need", In the Proceedings of the 31st Conference on Neural Information Processing Systems, USA, pp.5998-6008, 2017.
[10] S. Zhao, R. Meng, D. He, S. Andi, P. Bamabang, "Integrating Transformer and Paraphrase Rules for Sentence Simplification", In the Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, Belgium, pp.3164-3173, 2018.