
Helping Hearing-Impaired in Emergency Situations: A Deep Learning-Based Approach

Authors :
Qazi Mohammad Areeb
Maryam
Mohammad Nadeem
Roobaea Alroobaea
Faisal Anwer
Source :
IEEE Access, Vol. 10, pp. 8502-8517 (2022)
Publication Year :
2022
Publisher :
IEEE, 2022.

Abstract

Hearing-impaired people use sign language to express their thoughts and emotions and to reinforce information delivered in daily conversations. Although they make up a significant percentage of any population, most people cannot interact with them because of limited or no knowledge of sign languages. Sign language recognition aims to detect the significant motions of the human body, especially the hands, and to analyze and interpret them. Such systems may become life-saving when hearing-challenged people find themselves in desperate situations such as heart attacks or accidents. In the present study, deep learning-based hand gesture recognition models are developed to accurately predict the emergency signs of Indian Sign Language (ISL). The dataset used contains videos for eight different emergency situations. Several frames were extracted from the videos and fed to three different models. Two models are designed for classification, while the third is an object detection model applied after annotating the frames. The first model consists of a three-dimensional convolutional neural network (3D CNN), while the second comprises a pre-trained VGG-16 and a recurrent neural network with a long short-term memory (RNN-LSTM) scheme. The last model is based on YOLO (You Only Look Once) v5, an advanced object detection algorithm. The prediction accuracies of the classification models were 82% and 98%, respectively. The YOLO-based model outperformed the rest, achieving an impressive mean average precision of 99.6%.
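To make the second classification pipeline concrete, below is a minimal Keras sketch of a VGG-16 + LSTM video classifier of the kind the abstract describes. This is an illustration under assumptions, not the authors' published code: the frame count (16), input resolution (224x224), LSTM width (256), dropout rate, and optimizer are hypothetical choices; only the frozen pre-trained VGG-16 feature extractor, the LSTM over the frame sequence, and the eight-class softmax follow the abstract.

    import tensorflow as tf
    from tensorflow.keras import layers, models
    from tensorflow.keras.applications import VGG16

    NUM_FRAMES = 16          # frames sampled per video (assumed)
    FRAME_SIZE = (224, 224)  # VGG-16's standard input resolution
    NUM_CLASSES = 8          # eight emergency signs, per the abstract

    # Frozen ImageNet-pretrained VGG-16 extracts one 512-d feature
    # vector per frame (global average pooling over conv features).
    vgg = VGG16(weights="imagenet", include_top=False, pooling="avg",
                input_shape=(*FRAME_SIZE, 3))
    vgg.trainable = False

    model = models.Sequential([
        layers.Input(shape=(NUM_FRAMES, *FRAME_SIZE, 3)),
        layers.TimeDistributed(vgg),  # -> (NUM_FRAMES, 512) sequence
        layers.LSTM(256),             # temporal modelling of the gesture
        layers.Dropout(0.5),          # regularization (assumed)
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

    # Frames should be preprocessed with
    # tf.keras.applications.vgg16.preprocess_input before training.
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()

Freezing the backbone and training only the recurrent head is a common choice when, as here, the dataset covers a small number of classes; the paper itself may fine-tune VGG-16 or use different hyperparameters.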

Details

Language :
English
ISSN :
2169-3536
Volume :
10
Database :
Directory of Open Access Journals
Journal :
IEEE Access
Publication Type :
Academic Journal
Accession number :
edsdoj.3b1dcd78df65482facfec34c25935e41
Document Type :
article
Full Text :
https://doi.org/10.1109/ACCESS.2022.3142918