
A Continuous Space Neural Language Model for Bengali Language

Authors :
Chowdhury, Hemayet Ahmed
Imon, Md. Azizul Haque
Rahman, Anisur
Khatun, Aisha
Islam, Md. Saiful
Publication Year :
2020

Abstract

Language models are generally employed to estimate the probability distribution of various linguistic units, making them one of the fundamental components of natural language processing. Applications of language models span a wide spectrum of tasks such as text summarization, translation, and classification. For a low-resource language like Bengali, research in this area has so far been narrow, with only a few traditional count-based models proposed. This paper addresses the issue and proposes a continuous-space neural language model, more specifically an ASGD weight-dropped LSTM language model, along with techniques to train it efficiently for Bengali. The performance analysis against currently existing count-based models presented in this paper also shows that the proposed architecture outperforms its counterparts, achieving an inference perplexity as low as 51.2 on the held-out Bengali data set.

Comment: 6 pages
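
Illustration: the sketch below is a minimal PyTorch toy showing the two ideas named in the abstract: DropConnect applied to an LSTM's recurrent (hidden-to-hidden) weights, the "weight-dropped" part of the ASGD weight-dropped LSTM, and perplexity computed as the exponential of the mean per-token cross-entropy on held-out data. It is not the authors' implementation; the vocabulary size, model dimensions, and random stand-in data are hypothetical, and the ASGD averaging and the other AWD-LSTM regularizers are omitted.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightDropLSTMCell(nn.Module):
    """Single-layer LSTM cell with DropConnect applied to the
    hidden-to-hidden weights on every forward pass (the 'weight-dropped'
    idea behind the AWD-LSTM)."""
    def __init__(self, input_size, hidden_size, weight_drop=0.5):
        super().__init__()
        self.hidden_size = hidden_size
        self.weight_drop = weight_drop
        self.w_ih = nn.Parameter(torch.randn(4 * hidden_size, input_size) * 0.1)
        self.w_hh = nn.Parameter(torch.randn(4 * hidden_size, hidden_size) * 0.1)
        self.bias = nn.Parameter(torch.zeros(4 * hidden_size))

    def forward(self, x, state):
        h, c = state
        # DropConnect: randomly zero entries of the recurrent weight matrix
        # during training only.
        w_hh = F.dropout(self.w_hh, p=self.weight_drop, training=self.training)
        gates = x @ self.w_ih.t() + h @ w_hh.t() + self.bias
        i, f, g, o = gates.chunk(4, dim=-1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, (h, c)

class TinyLM(nn.Module):
    """Toy word-level language model: embedding -> weight-dropped LSTM -> softmax."""
    def __init__(self, vocab_size, emb_dim=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.cell = WeightDropLSTMCell(emb_dim, hidden)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, tokens):                        # tokens: (batch, seq_len)
        batch = tokens.size(0)
        h = tokens.new_zeros(batch, self.cell.hidden_size, dtype=torch.float)
        c = torch.zeros_like(h)
        logits = []
        for t in range(tokens.size(1)):
            out, (h, c) = self.cell(self.embed(tokens[:, t]), (h, c))
            logits.append(self.out(out))
        return torch.stack(logits, dim=1)             # (batch, seq_len, vocab)

def perplexity(model, tokens, targets):
    """Perplexity = exp(average per-token cross-entropy) on held-out tokens."""
    model.eval()
    with torch.no_grad():
        logits = model(tokens)
        loss = F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                               targets.reshape(-1))
    return math.exp(loss.item())

if __name__ == "__main__":
    vocab_size = 1000                                 # hypothetical vocabulary size
    model = TinyLM(vocab_size)
    seq = torch.randint(0, vocab_size, (4, 20))       # random stand-in for a corpus
    print("perplexity:", perplexity(model, seq[:, :-1], seq[:, 1:]))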

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1228386042
Document Type :
Electronic Resource