
The Classification of Abnormal Hand Movement to Aid in Autism Detection: Machine Learning Study

Authors :
Anish Lakkapragada
Aaron Kline
Onur Cezmi Mutlu
Kelley Paskov
Brianna Chrisman
Nathaniel Stockham
Peter Washington
Dennis Paul Wall
Source :
JMIR Biomedical Engineering, Vol 7, Iss 1, p e33771 (2022)
Publication Year :
2022
Publisher :
JMIR Publications, 2022.

Abstract

Background: A formal autism diagnosis can be an inefficient and lengthy process. Families may wait several months or longer to receive a diagnosis for their child despite evidence that earlier intervention leads to better treatment outcomes. Digital technologies that detect the presence of behaviors related to autism can scale access to pediatric diagnoses. A strong indicator of the presence of autism is self-stimulatory behavior such as hand flapping.

Objective: This study aims to demonstrate the feasibility of deep learning technologies for detecting hand flapping in unstructured home videos, as a first step toward validating whether statistical models coupled with digital technologies can aid in the automatic behavioral analysis of autism. To support the widespread sharing of such home videos, we explored privacy-preserving modifications to the input space by converting each video to hand landmark coordinates, and we measured the performance of the corresponding time series classifiers.

Methods: We used the Self-Stimulatory Behavior Dataset (SSBD), which contains 75 videos of children exhibiting hand flapping, head banging, and spinning. From this data set, we extracted 100 hand flapping videos and 100 control videos, each between 2 and 5 seconds in duration. We evaluated five separate feature representations: four privacy-preserved subsets of hand landmarks detected by MediaPipe and one feature representation obtained from the output of the penultimate layer of a MobileNetV2 model fine-tuned on the SSBD. We fed these feature vectors into a long short-term memory (LSTM) network that predicted the presence of hand flapping in each video clip.

Results: The highest-performing model used MobileNetV2 to extract features and achieved a test F1 score of 84 (SD 3.7; precision 89.6, SD 4.3; recall 80.4, SD 6) using 5-fold cross-validation with 100 random seeds on the SSBD data (500 distinct folds in total). Of the models trained on privacy-preserved data, the model trained with all hand landmarks reached an F1 score of 66.6 (SD 3.35), and a model trained with a select 6 landmarks reached an F1 score of 68.3 (SD 3.6). A model trained using the single landmark at the base of the hand and a model trained with the average of the locations of all the hand landmarks reached F1 scores of 64.9 (SD 6.5) and 64.2 (SD 6.8), respectively.

Conclusions: We created five lightweight neural networks that can detect hand flapping from unstructured videos. Training an LSTM network with convolutional feature vectors outperformed training with feature vectors of hand coordinates and used almost 900,000 fewer model parameters. This study provides a first step toward developing precise deep learning methods for activity detection of autism-related behaviors.
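The four privacy-preserved feature representations described in the abstract (all hand landmarks, a 6-landmark subset, the single landmark at the base of the hand, and the mean location of all landmarks) can be sketched as simple per-frame transformations of a MediaPipe-style landmark array. The specific landmark indices and function names below are illustrative assumptions, not the authors' published code: MediaPipe Hands reports 21 landmarks per hand, with index 0 at the wrist, and a fingertips-plus-wrist subset is used here as one plausible choice of 6 landmarks.

```python
import numpy as np

# Assumed input shape: (frames, 21, 2) — 21 (x, y) hand landmarks per frame,
# as produced by MediaPipe Hands for a single detected hand.
# The 6-landmark subset below (wrist + five fingertips) is an illustrative
# assumption; the paper does not list the exact indices in this abstract.
SIX_LANDMARKS = [0, 4, 8, 12, 16, 20]

def all_landmarks(seq: np.ndarray) -> np.ndarray:
    """Flatten every landmark per frame -> (frames, 42)."""
    return seq.reshape(seq.shape[0], -1)

def six_landmarks(seq: np.ndarray) -> np.ndarray:
    """Keep a 6-landmark subset per frame -> (frames, 12)."""
    return seq[:, SIX_LANDMARKS, :].reshape(seq.shape[0], -1)

def wrist_only(seq: np.ndarray) -> np.ndarray:
    """Single landmark at the base of the hand -> (frames, 2)."""
    return seq[:, 0, :]

def mean_landmark(seq: np.ndarray) -> np.ndarray:
    """Average location of all landmarks per frame -> (frames, 2)."""
    return seq.mean(axis=1)

# Each output is a coordinate time series that could be fed to an LSTM
# classifier, one time step per video frame.
clip = np.random.rand(90, 21, 2)  # e.g., a 3-second clip at 30 fps
print(all_landmarks(clip).shape)   # (90, 42)
print(six_landmarks(clip).shape)   # (90, 12)
print(wrist_only(clip).shape)      # (90, 2)
print(mean_landmark(clip).shape)   # (90, 2)
```

Because these representations discard the raw pixels and keep only coordinate trajectories, a video shared in this form reveals hand motion but not the child's appearance, which is the privacy-preserving property the study relies on.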

Subjects

Subjects :
Medical technology
R855-855.5

Details

Language :
English
ISSN :
2561-3278
Volume :
7
Issue :
1
Database :
Directory of Open Access Journals
Journal :
JMIR Biomedical Engineering
Publication Type :
Academic Journal
Accession number :
edsdoj.35b11c21fad4bb0ad83918245daa711
Document Type :
article
Full Text :
https://doi.org/10.2196/33771