1. Capturing complex hand movements and object interactions using machine learning-powered stretchable smart textile gloves
- Author
Tashakori, Arvin, Jiang, Zenan, Servati, Amir, Soltanian, Saeid, Narayana, Harishkumar, Le, Katherine, Nakayama, Caroline, Yang, Chieh-ling, Wang, Z. Jane, Eng, Janice J., and Servati, Peyman
- Subjects
Computer Science - Human-Computer Interaction, Computer Science - Computer Vision and Pattern Recognition, Computer Science - Machine Learning, Computer Science - Robotics, Electrical Engineering and Systems Science - Signal Processing
- Abstract
Accurate real-time tracking of dexterous hand movements and interactions has numerous applications in human-computer interaction, the metaverse, robotics, and telehealth. Capturing realistic hand movements is challenging because of the large number of articulations and degrees of freedom. Here, we report accurate and dynamic tracking of articulated hand and finger movements using stretchable, washable smart gloves with embedded helical sensor yarns and inertial measurement units. The sensor yarns have a high dynamic range, responding to strains from as low as 0.005% to as high as 155%, and remain stable through extensive use and washing cycles. Using multi-stage machine learning, we achieve average joint-angle estimation root-mean-square errors of 1.21 and 1.45 degrees for intra- and inter-subject cross-validation, respectively, matching the accuracy of costly motion-capture cameras without occlusion or field-of-view limitations. We also report a data augmentation technique that enhances robustness to sensor noise and variability. We demonstrate accurate tracking of dexterous hand movements during object interactions, opening new application avenues that include accurate typing on a mock paper keyboard, recognition of complex dynamic and static gestures adapted from American Sign Language, and object identification.
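For context, the quoted joint-angle errors are root-mean-square errors averaged over joints, and the robustness claim rests on augmenting sensor signals. The minimal Python sketch below illustrates one plausible form of both; the paper's actual models, channel counts, and augmentation parameters are not given in this listing, so every function name and value here is a hypothetical stand-in rather than the authors' implementation.

```python
# Illustrative sketch only, not the paper's pipeline.
# (1) A per-joint-angle RMSE metric like the one quoted in the abstract.
# (2) A generic gain/noise augmentation for strain-sensor channels, one plausible
#     way to improve robustness to sensor noise and variation.
import numpy as np


def joint_angle_rmse(pred_deg: np.ndarray, true_deg: np.ndarray) -> float:
    """Root-mean-square error (degrees), averaged over all joints and time steps."""
    return float(np.sqrt(np.mean((pred_deg - true_deg) ** 2)))


def augment_sensor_window(window: np.ndarray,
                          noise_std: float = 0.01,
                          gain_range: tuple = (0.9, 1.1),
                          rng: np.random.Generator | None = None) -> np.ndarray:
    """Apply a random per-channel gain and additive Gaussian noise to a
    (time, channels) window of normalized sensor readings."""
    rng = rng or np.random.default_rng()
    gain = rng.uniform(*gain_range, size=(1, window.shape[1]))  # per-channel scaling
    noise = rng.normal(0.0, noise_std, size=window.shape)       # additive noise
    return window * gain + noise


# Example with synthetic data: 16 hypothetical strain channels, 200 time steps,
# and 21 hypothetical hand-joint angles in degrees.
rng = np.random.default_rng(0)
sensors = rng.standard_normal((200, 16))
augmented = augment_sensor_window(sensors, rng=rng)
true_angles = rng.uniform(0, 90, size=(200, 21))
pred_angles = true_angles + rng.normal(0, 1.3, size=true_angles.shape)
print(f"RMSE: {joint_angle_rmse(pred_angles, true_angles):.2f} degrees")
```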
- Published
- 2024