
Depth Estimation for Egocentric Rehabilitation Monitoring Using Deep Learning Algorithms

Authors :
Yasaman Izadmehr
Héctor F. Satizábal
Kamiar Aminian
Andres Perez-Uribe
Source :
Applied Sciences; Volume 12; Issue 13; Pages: 6578
Publisher :
MDPI

Abstract

Upper limb impairment is one of the most common problems for people with neurological disabilities, affecting their activity, quality of life (QOL), and independence. Objective assessment of upper limb performance is a promising way to help patients with neurological upper limb disorders. Wearable sensors, such as an egocentric camera, make it possible to monitor and objectively assess patients’ actual performance in activities of daily living (ADLs). We analyzed the possibility of using deep learning models for depth estimation from a single RGB image, so that patients can be monitored with 2D (RGB) cameras. We conducted experiments placing objects at different distances from the camera and varying the lighting conditions to evaluate the depth estimates provided by two deep learning models (MiDaS and Alhashim’s). Finally, we integrated the best-performing depth-estimation model (MiDaS) with deep learning models for hand detection (MediaPipe) and object detection (YOLO) and evaluated the resulting system on a hand-object interaction task. Our tests showed that the final system detects interactions with 78% performance, while the reference performance using a 3D (depth) camera is 84%.
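
To make the pipeline described above more concrete, the following Python sketch shows one way monocular depth estimation (MiDaS), hand detection (MediaPipe Hands), and object detection (YOLO) could be combined to flag a hand-object interaction in a single egocentric RGB frame. This is an illustration only, not the authors' implementation: the YOLOv8 weights, the depth_margin threshold, the use of the wrist landmark as the hand position, and the decision rule (hand inside an object box plus a similar relative depth) are assumptions introduced here for clarity.

# Hypothetical sketch of a MiDaS + MediaPipe + YOLO interaction detector.
# The decision rule and thresholds are assumptions for illustration.
import cv2
import numpy as np
import torch
import mediapipe as mp
from ultralytics import YOLO

# Load the three models once.
midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
midas.eval()
midas_tf = torch.hub.load("intel-isl/MiDaS", "transforms").small_transform
hands = mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1)
yolo = YOLO("yolov8n.pt")  # generic COCO detector, a stand-in for the paper's object model

def relative_depth(frame_rgb: np.ndarray) -> np.ndarray:
    """Run MiDaS on an RGB frame and return a relative depth map at frame resolution."""
    with torch.no_grad():
        pred = midas(midas_tf(frame_rgb))
        pred = torch.nn.functional.interpolate(
            pred.unsqueeze(1), size=frame_rgb.shape[:2],
            mode="bicubic", align_corners=False,
        ).squeeze()
    return pred.cpu().numpy()

def detect_interaction(frame_bgr: np.ndarray, depth_margin: float = 0.1) -> bool:
    """Return True if the hand appears close to a detected object in 2D and in depth."""
    frame_rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    h, w = frame_rgb.shape[:2]

    depth = relative_depth(frame_rgb)
    depth_range = depth.max() - depth.min() + 1e-6

    # Hand position: use the wrist landmark (index 0) as a proxy for the hand.
    res = hands.process(frame_rgb)
    if not res.multi_hand_landmarks:
        return False
    wrist = res.multi_hand_landmarks[0].landmark[0]
    hx, hy = int(wrist.x * w), int(wrist.y * h)
    hand_depth = depth[np.clip(hy, 0, h - 1), np.clip(hx, 0, w - 1)]

    # Objects: compare the hand's relative depth with the median depth in each box.
    for box in yolo(frame_bgr, verbose=False)[0].boxes.xyxy.cpu().numpy():
        x1, y1, x2, y2 = box.astype(int)
        x1, y1, x2, y2 = max(x1, 0), max(y1, 0), min(x2, w), min(y2, h)
        if x1 <= hx <= x2 and y1 <= hy <= y2:  # wrist falls inside the box in 2D
            obj_depth = np.median(depth[y1:y2, x1:x2])
            if abs(hand_depth - obj_depth) / depth_range < depth_margin:
                return True  # similar relative depth suggests an interaction
    return False

Because MiDaS produces relative (inverse) depth rather than metric distances, the comparison above is made on normalized depth differences; a practical system would need a calibrated or tuned threshold, which is consistent with the paper's evaluation of depth estimates at different distances and lighting conditions.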

Details

Database :
OpenAIRE
Journal :
Applied Sciences; Volume 12; Issue 13; Pages: 6578
Accession number :
edsair.doi.dedup.....38243c0a11bf493c125c0c0b41013d14