
Real-Time Human-UAV Interaction: New Dataset and Two Novel Gesture-Based Interacting Systems

Authors :
Mohamed A. Kassab
Mostafa Ahmed
Ali Maher
Baochang Zhang
Source :
IEEE Access, Vol 8, Pp 195030-195045 (2020)
Publication Year :
2020
Publisher :
IEEE, 2020.

Abstract

Two novel gesture-based Human-UAV Interaction (HUI) systems are proposed to launch and control a UAV in real time using a monocular camera and a ground computer. The first is an end-to-end static Gesture-Based Interaction (GBI) system that classifies the interacting user's poses directly, discarding the gesture-interpreting component to boost performance, reaching up to 99% accuracy at 28 fps. The second is a dynamic system that adopts a simple model to detect three parts of the interacting person (the face and the two hands) and tracks them over a fixed number of frames until a specific dynamic gesture is recognized. The proposed dynamic method is efficient, reduces complexity, and speeds the interaction up to 27 fps compared with recent multi-model approaches. Its backbone is a simplified Tiny-You Only Look Once (YOLO) network that saves resources and speeds the detection process up to 120 fps. Moreover, a comprehensive new gesture dataset was established to facilitate the learning process and aid further research. A comparative study demonstrates the performance and efficiency of the proposed dynamic HUI system, in terms of detection accuracy and speed, against the baseline detector on a public human-gesture dataset. Finally, a non-expert volunteer evaluated the proposed HUIs by launching and driving a Bebop 2 micro UAV through a set of real flights.
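The sketch below is not the authors' code; it is a minimal illustration, under stated assumptions, of the dynamic pipeline the abstract describes: a light detector finds three parts (face and two hands) in each frame, their positions are tracked over a fixed window, and the accumulated motion is mapped to a UAV command. The detector and classifier callables, the part names, the window length, and the command labels are all hypothetical.

```python
# Illustrative sketch only (not the paper's implementation).
from collections import deque

PARTS = ("face", "left_hand", "right_hand")   # three parts, per the abstract
WINDOW = 30                                   # frames tracked before deciding (assumed)

class DynamicGestureRecognizer:
    def __init__(self, detector, classifier):
        self.detector = detector              # e.g. a simplified Tiny-YOLO wrapper (assumed interface)
        self.classifier = classifier          # maps tracked trajectories -> gesture label (assumed)
        self.tracks = {p: deque(maxlen=WINDOW) for p in PARTS}

    def update(self, frame):
        """Detect the three parts in one frame, extend their tracks, and
        return a gesture label once a full window has been accumulated."""
        detections = self.detector(frame)     # assumed to return {part_name: (x, y, w, h)}
        for part in PARTS:
            box = detections.get(part)
            if box is not None:
                x, y, w, h = box
                self.tracks[part].append((x + w / 2, y + h / 2))  # track box centers
        if all(len(t) == WINDOW for t in self.tracks.values()):
            gesture = self.classifier(self.tracks)  # e.g. "take_off", "land" (hypothetical labels)
            for t in self.tracks.values():
                t.clear()                     # start a fresh window after each decision
            return gesture
        return None
```

In use, `update` would be called once per camera frame, and a non-`None` return value would be forwarded to the UAV control layer; the windowed design mirrors the abstract's description of tracking the parts for a certain number of frames before a gesture is declared.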

Details

Language :
English
ISSN :
2169-3536
Volume :
8
Database :
Directory of Open Access Journals
Journal :
IEEE Access
Publication Type :
Academic Journal
Accession number :
edsdoj.9ce2479f0d247cb96ee74c7214da212
Document Type :
article
Full Text :
https://doi.org/10.1109/ACCESS.2020.3033157