
Towards audio-based identification of Ethio-Semitic languages using recurrent neural network.

Authors :
Alemu AA
Melese MD
Salau AO
Source :
Scientific reports [Sci Rep] 2023 Nov 07; Vol. 13 (1), pp. 19346. Date of Electronic Publication: 2023 Nov 07.
Publication Year :
2023

Abstract

In recent times, there has been increasing interest in employing technology to process natural language with the aim of providing information that can benefit society. Language identification refers to the process of detecting which language a speaker is using. This paper presents an audio-based Ethio-Semitic language identification system using a Recurrent Neural Network (RNN). Identifying features that can accurately differentiate between languages is a difficult task because of the very high similarity between the characters of each language. In this paper, an RNN was used with Mel-frequency cepstral coefficient (MFCC) features to bring out the key characteristics that yield good results. The primary goal of this research is to find the best model for the identification of Ethio-Semitic languages, namely Amharic, Geez, Guragigna, and Tigrigna. The models were tested on an 8-h collection of audio recordings. Experiments were carried out on our unique dataset with two extended versions of the RNN, Long Short-Term Memory (LSTM) and Bidirectional Long Short-Term Memory (BLSTM), on 5 s and 10 s segments, respectively. According to the results, BLSTM on 5 s segments outperformed LSTM. The BLSTM model achieved average training, validation, and testing accuracies of 98.1%, 92.9%, and 89.9%, respectively. We can therefore infer that the best performing method for the selected Ethio-Semitic language dataset was the BLSTM algorithm with MFCC features on 5 s segments.
(© 2023. The Author(s).)
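
To make the described pipeline concrete, the following is a minimal sketch of an MFCC-plus-BLSTM language classifier of the kind outlined in the abstract, assuming librosa for feature extraction and Keras for the model. The sampling rate, number of MFCC coefficients, layer sizes, and function names here are illustrative assumptions, not the authors' actual configuration or code.

```python
# Minimal sketch (not the authors' code): MFCC features fed to a small BLSTM
# classifier over the four Ethio-Semitic languages. Hyperparameters are assumed.
import numpy as np
import librosa
import tensorflow as tf
from tensorflow.keras import layers, models

LANGUAGES = ["Amharic", "Geez", "Guragigna", "Tigrigna"]
N_MFCC = 13          # assumed number of MFCC coefficients
CLIP_SECONDS = 5     # 5 s segments, the best-performing length in the paper
SAMPLE_RATE = 16000  # assumed sampling rate

def extract_mfcc(path: str) -> np.ndarray:
    """Load a clip, trim/pad it to CLIP_SECONDS, and return a (frames, N_MFCC) matrix."""
    y, _ = librosa.load(path, sr=SAMPLE_RATE, duration=CLIP_SECONDS)
    y = librosa.util.fix_length(y, size=SAMPLE_RATE * CLIP_SECONDS)
    mfcc = librosa.feature.mfcc(y=y, sr=SAMPLE_RATE, n_mfcc=N_MFCC)
    return mfcc.T  # time-major: one MFCC vector per frame

def build_blstm(input_shape, n_classes=len(LANGUAGES)) -> tf.keras.Model:
    """Bidirectional LSTM over MFCC frames with a softmax over the four languages."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Bidirectional(layers.LSTM(128)),
        layers.Dense(64, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Number of MFCC frames for a 5 s clip at librosa's default hop length (512 samples).
frames = 1 + (SAMPLE_RATE * CLIP_SECONDS) // 512
model = build_blstm(input_shape=(frames, N_MFCC))
model.summary()
```

In this sketch, swapping `layers.Bidirectional(layers.LSTM(128))` for a plain `layers.LSTM(128)` gives the unidirectional LSTM baseline that the BLSTM is compared against in the abstract.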

Details

Language :
English
ISSN :
2045-2322
Volume :
13
Issue :
1
Database :
MEDLINE
Journal :
Scientific reports
Publication Type :
Academic Journal
Accession number :
37935777
Full Text :
https://doi.org/10.1038/s41598-023-46646-3