A Subword Level Language Model for Bangla Language
- Source :
- Proceedings of International Joint Conference on Computational Intelligence ISBN: 9789811536069, IJCCI
- Publication Year :
- 2020
- Publisher :
- Springer Singapore, 2020.
Abstract
- Language models are at the core of natural language processing; the ability to represent natural language underlies numerous NLP tasks, including text classification, summarization, and translation. Research in this area is very limited for Bangla owing to the scarcity of resources: apart from some count-based models and a few recently proposed neural language models, existing approaches are all word-based and of limited practical use due to their high perplexity. This paper addresses the perplexity problem and proposes a subword-level neural language model built on the AWD-LSTM architecture, together with training techniques suited to the Bangla language. The model is trained on a sizable corpus of Bangla newspaper articles comprising more than 28.5 million word tokens. Compared with various other models, the proposed model achieves a significant reduction in perplexity, reaching as low as 39.84 in just 20 epochs.
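The headline metric in the abstract, perplexity, is the exponential of the average per-token cross-entropy of the model on held-out text; a subword-level model computes it over subword tokens rather than whole words, which shrinks the vocabulary and better handles Bangla's rich morphology. A minimal sketch of the computation (the probabilities below are hypothetical, not taken from the paper):

```python
import math

def perplexity(token_log_probs):
    """Perplexity = exp of the average negative log-likelihood per token.

    token_log_probs: natural-log probabilities the model assigned
    to each (subword) token in an evaluation sequence.
    """
    avg_nll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(avg_nll)

# Hypothetical per-token probabilities for a short subword sequence.
log_probs = [math.log(p) for p in (0.25, 0.10, 0.05, 0.20)]
print(round(perplexity(log_probs), 2))
```

A lower value means the model is, on average, less "surprised" by the test tokens; the paper's reported 39.84 would correspond to an average negative log-likelihood of about ln(39.84) ≈ 3.69 nats per subword token.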
- Subjects :
- Perplexity
Computer science
Deep learning
Automatic summarization
Bengali
Language model
Artificial intelligence
Natural language processing
Natural language
Details
- Database :
- OpenAIRE
- Journal :
- Proceedings of International Joint Conference on Computational Intelligence ISBN: 9789811536069, IJCCI
- Accession number :
- edsair.doi...........2ea794da29768783ca312c8c93f3208f
- Full Text :
- https://doi.org/10.1007/978-981-15-3607-6_31