A Case Study of Improving English-Arabic Translation Using the Transformer Model
- Source :
- International Journal of Intelligent Computing & Information Sciences; Jun 2023, Vol. 23 Issue 2, p105-115, 11p
- Publication Year :
- 2023
Abstract
- Arabic is a morphologically rich, low-resource language and is therefore recognized as one of the most challenging languages for machine translation. Translation into Arabic has received significantly less attention than translation of European languages, so the quality of Arabic machine translation warrants further investigation. This paper proposes a translation model between Arabic and English based on Neural Machine Translation (NMT). The proposed model employs a transformer that combines a multi-head attention mechanism with a feed-forward network. The proposed NMT model demonstrated its effectiveness in improving translation, achieving an accuracy of 97.68%, a loss of 0.0778, and a near-perfect Bilingual Evaluation Understudy (BLEU) score of 99.95. Future work will focus on more effective approaches to evaluation and quality estimation of NMT for low-resource languages, which are challenging because of the scarcity of reference translations and human annotators. [ABSTRACT FROM AUTHOR]
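- The abstract's core building block is a transformer layer that pairs multi-head attention with a feed-forward network. The sketch below is only an illustration of that standard combination, not the authors' code; the hyperparameters (model width 512, 8 heads, feed-forward width 2048) are assumptions borrowed from the original Transformer architecture.

```python
import torch
import torch.nn as nn

class TransformerBlock(nn.Module):
    """One encoder block: multi-head self-attention followed by a position-wise
    feed-forward network, each with a residual connection and layer normalization."""
    def __init__(self, d_model=512, num_heads=8, d_ff=2048, dropout=0.1):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, num_heads,
                                          dropout=dropout, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(d_model, d_ff),
            nn.ReLU(),
            nn.Linear(d_ff, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x, key_padding_mask=None):
        # Multi-head self-attention sub-layer with residual connection.
        attn_out, _ = self.attn(x, x, x, key_padding_mask=key_padding_mask)
        x = self.norm1(x + self.dropout(attn_out))
        # Feed-forward sub-layer with residual connection.
        x = self.norm2(x + self.dropout(self.ff(x)))
        return x

# Toy usage: a batch of 2 sentences, 10 tokens each, already embedded to 512 dims.
block = TransformerBlock()
tokens = torch.randn(2, 10, 512)
print(block(tokens).shape)  # torch.Size([2, 10, 512])
```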
Details
- Language :
- English
- ISSN :
- 1687-109X
- Volume :
- 23
- Issue :
- 2
- Database :
- Complementary Index
- Journal :
- International Journal of Intelligent Computing & Information Sciences
- Publication Type :
- Academic Journal
- Accession number :
- 169746161
- Full Text :
- https://doi.org/10.21608/ijicis.2023.210435.1270