1. Bangla Sign Language recognition using convolutional neural network
- Author
- Sasikumaran Sreedharan, P. W. C. Prasad, Farhad Yasir, Amr Elchouemi, and Abeer Alsadoon
- Subjects
Computer science, Feature extraction, Pattern recognition, Sign language, Convolutional neural network, Gesture recognition, Feature (machine learning), Artificial intelligence, Hidden Markov model, Gesture
- Abstract
This paper presents a learning-based approach to Bangla Sign Language (BdSL) recognition using a convolutional neural network. In the proposed method, a virtual-reality hand-tracking device, the Leap Motion controller (LMC), is introduced to track the continuous motion of the hands. The LMC provides a skeletal model of the hand with data on hand position, orientation, rotation, fingertips, grabbing, and other non-linear features. The controller preprocesses all motion features and provides error-free data; it calibrates to the environment and builds a virtual hand in space. The LMC also computes rotation, orientation, and texture information from the hands to detect and extract hand gestures. Next, an efficient method processes a sequence of frames of positional hand gestures and summarizes them into a shorter, more generalized sequence of lines and curves, which is fed to a Hidden Markov Model (HMM). For each sign expression we define a start state and an end state and segment the state transitions into a segmented HMM, assuming the state space of the hidden variables is discrete. The transition probabilities govern the hidden state at each distinct time step. If a histogram difference appears in any state, the transition moves to a new frame to capture a new sign expression; if a frame contains no hand gesture, the state sequence terminates at the end point of the model, where the detected hand gesture is evaluated for recognition. After evaluation, the hand gesture dataset is passed to a convolutional neural network (CNN), which builds a decision network. Each neuron computes a dot product of the extracted features in the dataset. The CNN receives a single vector of hand gesture data, connects it through a series of hidden layers, and at the end computes a single-vector loss function.
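The histogram-based frame segmentation described above can be sketched as follows. This is a minimal illustration in NumPy under stated assumptions: the frames are grayscale arrays, and the function names `histogram_diff` and `segment_signs` and the threshold value are hypothetical, not the authors' code.

```python
import numpy as np

def histogram_diff(frame_a, frame_b, bins=32):
    """L1 distance between normalized intensity histograms of two frames."""
    ha, _ = np.histogram(frame_a, bins=bins, range=(0, 256), density=True)
    hb, _ = np.histogram(frame_b, bins=bins, range=(0, 256), density=True)
    return float(np.abs(ha - hb).sum())

def segment_signs(frames, threshold=0.1):
    """Split a frame sequence into candidate sign segments wherever the
    histogram difference between consecutive frames exceeds `threshold`,
    mirroring the start/end-point state segmentation described above.
    The threshold is an assumed tuning parameter."""
    boundaries = [0]
    for i in range(1, len(frames)):
        if histogram_diff(frames[i - 1], frames[i]) > threshold:
            boundaries.append(i)
    boundaries.append(len(frames))
    return [frames[s:e] for s, e in zip(boundaries, boundaries[1:]) if e > s]
```

Each returned segment would then be evaluated as one candidate sign expression before classification.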
Each feature is treated as a hidden layer. By finding the least loss, the network recognizes the expected sign expression. In our experiments, we first used training data to create the neurons in the network in a supervised manner. We achieved significant results on basic sign expressions with a 3% error rate, which drops to 2% without distortion. This is a substantial advance for Bangla sign language recognition.
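The decision step above, in which each neuron is a dot product of the previous layer's features and the class with the least loss is recognized, can be illustrated with a minimal fully connected sketch. The layer sizes, weight names, and number of classes are assumptions for illustration only; the paper's actual network is convolutional and its dimensions are not given.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical dimensions: a 64-feature gesture vector, one hidden layer
# of 32 neurons, and 10 candidate sign classes.
W1, b1 = rng.standard_normal((32, 64)) * 0.1, np.zeros(32)
W2, b2 = rng.standard_normal((10, 32)) * 0.1, np.zeros(10)

def recognize(feature_vec):
    """Forward pass: each neuron computes a dot product of the previous
    layer's output; the class with the lowest per-class negative
    log-likelihood (i.e. the least loss) is recognized."""
    h = relu(W1 @ feature_vec + b1)
    probs = softmax(W2 @ h + b2)
    losses = -np.log(probs)  # per-class loss; argmin picks the winner
    return int(np.argmin(losses)), probs
```

In a trained network the weights would come from supervised training on the segmented gesture dataset rather than from random initialization as here.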
- Published
- 2017