1. An Adaptive System of Yogic Gesture Recognition for Human Computer Interaction
- Author
- Priyanka Choudhary and S. N. Tazi
- Subjects
Set (abstract data type), Background subtraction, Computer science, Gesture recognition, Adaptive system, Interface (computing), Computer vision, Artificial intelligence, Convolutional neural network, Test data, Gesture
- Abstract
The purpose of this research is to validate the potential of yogic hand gestures in a well-formed human-computer interface. Using a real-time image sequence captured by a video recording device, the system spontaneously traces the potential subject region (PSR), essentially the hand region, with the help of a skin detection algorithm, and then detects and recognises hand gestures for human-computer interaction. To detect skin, we use skin colour detection and softening to remove extraneous background information from the image, and then apply background subtraction to locate the PSR. To further suppress background information, we track the detected PSR with the kernelised correlation filters (KCF) algorithm. The PSR image is then resized to 50 × 50 pixels and fed into a deep convolutional neural network (CNN) to identify eight yogic hand gestures. The deep CNN architecture developed in this study is a modified VGGNet. The tracking and recognition process is repeated with a ranking algorithm to produce a real-time impression, and the system continues to execute until the hand leaves the camera range. During recognition, the top-ranked image captures are added to the sample pool for future training. The training data set reaches a recognition rate of 99.00% and the test data set a recognition rate of 95.89%, which demonstrates the feasibility of practical application. The implemented proof of concept and the custom yogic gesture dataset, named the YoGiR-1 dataset, are available on request.
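The abstract outlines a pipeline of skin-colour detection, background subtraction, KCF tracking, and CNN classification of a 50 × 50 crop. The sketch below is an illustrative reconstruction of that pipeline, not the authors' implementation: it assumes Python with opencv-contrib-python (for the KCF tracker) and TensorFlow/Keras, and the HSV skin bounds, blob-size threshold, layer sizes of the VGG-style network, and helper names are assumptions. The ranking step that feeds top captures back into the training pool is omitted.

```python
# Illustrative sketch (not the authors' code): detect a skin-coloured hand
# region, combine it with background subtraction to find the PSR, track it
# with KCF, and classify the 50x50 crop with a small VGG-style CNN.
import cv2
import numpy as np
import tensorflow as tf

NUM_GESTURES = 8  # eight yogic hand gestures


def build_modified_vggnet(input_shape=(50, 50, 3), num_classes=NUM_GESTURES):
    """Compact VGG-style stack of 3x3 convolutions; the exact depth and
    filter counts of the paper's modified VGGNet are assumed here."""
    layers = tf.keras.layers
    return tf.keras.Sequential([
        layers.Conv2D(32, 3, padding="same", activation="relu",
                      input_shape=input_shape),
        layers.Conv2D(32, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])


def skin_mask(frame_bgr):
    """Rough skin-colour segmentation in HSV, softened with a Gaussian blur."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 40, 60), (25, 255, 255))  # assumed bounds
    return cv2.GaussianBlur(mask, (5, 5), 0)


def main():
    model = build_modified_vggnet()  # untrained; load trained weights in practice
    cap = cv2.VideoCapture(0)
    bg_sub = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
    tracker = None

    while True:
        ok, frame = cap.read()
        if not ok:
            break

        if tracker is None:
            # Combine skin colour and motion evidence to locate the PSR.
            mask = cv2.bitwise_and(skin_mask(frame), bg_sub.apply(frame))
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            if contours:
                x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
                if w * h > 1500:  # ignore tiny blobs (threshold assumed)
                    tracker = cv2.TrackerKCF_create()
                    tracker.init(frame, (x, y, w, h))
        else:
            found, (x, y, w, h) = tracker.update(frame)
            if not found:  # hand left the camera range; re-detect
                tracker = None
                continue
            roi = frame[int(y):int(y + h), int(x):int(x + w)]
            if roi.size:
                patch = cv2.resize(roi, (50, 50)).astype("float32") / 255.0
                probs = model.predict(patch[None, ...], verbose=0)[0]
                print("predicted gesture:", int(np.argmax(probs)))

        cv2.imshow("PSR", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break

    cap.release()
    cv2.destroyAllWindows()


if __name__ == "__main__":
    main()
```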
- Published
- 2020