Quantification of tremor dynamics via video-based analysis.
- Author
- Lee, Seung-Hwan; Lee, Dongseop; Park, Jihoon; Shim, Jae-Min; Kim, Baeksop
- Subjects
- MACHINE learning; ESSENTIAL tremor; SUPPORT vector machines; K-nearest neighbor classification; DECISION trees; RANDOM forest algorithms; MULTILAYER perceptrons; NAIVE Bayes classification
- Abstract
Background & purpose: Tremor is a common movement disorder that is typically diagnosed using electrophysiological methods. Machine learning (ML) algorithms can now analyze image-based data efficiently, so we subjected the dynamics of essential tremor (ET) to video-based analysis.

Methods: We enrolled 59 ET patients and 48 age-matched normal controls. Tremor severity was scored with the Clinical Rating Scale for Tremor. All subjects used a smartphone to record an image designed especially for this study, both while stationary and while in motion. The recorded trajectories were divided into low-pass-filtered and bandpass-filtered (BPF) groups based on frequency. We extracted seven trajectory features: angle, velocity, homogeneity, pitch, power, entropy, and cosine. Student's t-test was used to compare the features of the ET patients and the normal controls, and a random forest model was used to rank feature importance. Five ML models (random forest, k-nearest neighbors, support vector machine, decision tree, and multi-layer perceptron) were applied to estimate diagnostic accuracy.

Results: Most features of the BPF signals differed significantly between the two groups. The velocity and homogeneity of the BPF trajectory ranked highest in the stationary and motion phases, respectively. The highest accuracies for predicting ET in the stationary, motion, and combined phases were 0.901, 0.757, and 0.892, respectively.

Conclusions: Features of ET tremor were evident in image-based data, enabling analysis of the tremor dynamics. ML analyses distinguished ET subjects from normal controls; however, more research is needed. [ABSTRACT FROM AUTHOR]
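The pipeline the abstract describes — bandpass-filter the recorded trajectory, extract trajectory features, and classify with a random forest — can be sketched as below. This is an illustrative reconstruction on synthetic data, not the authors' code: the 4-12 Hz tremor band, the 30 Hz sampling rate, the two-feature subset (velocity and spectral entropy), and the simulated trajectories are all assumptions for the sake of the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)


def bandpass(signal, low_hz, high_hz, fs, order=4):
    # Zero-phase Butterworth bandpass filter (assumed design).
    nyq = fs / 2.0
    b, a = butter(order, [low_hz / nyq, high_hz / nyq], btype="band")
    return filtfilt(b, a, signal)


def features(traj, fs=30.0):
    # Illustrative subset of the paper's seven features:
    # mean absolute velocity and spectral entropy of the BPF trajectory.
    bpf = bandpass(traj, 4.0, 12.0, fs)            # assumed tremor band
    velocity = np.abs(np.diff(bpf)).mean() * fs
    psd = np.abs(np.fft.rfft(bpf)) ** 2
    p = psd / psd.sum()
    entropy = -(p * np.log(p + 1e-12)).sum()
    return [velocity, entropy]


# Synthetic stand-in trajectories: "tremor" samples carry an 8 Hz component.
fs, n = 30.0, 300
t = np.arange(n) / fs
X, y = [], []
for label in (0, 1):                               # 0 = control, 1 = tremor
    for _ in range(40):
        traj = rng.normal(0.0, 0.5, n) + label * 0.8 * np.sin(2 * np.pi * 8 * t)
        X.append(features(traj, fs))
        y.append(label)

# Random forest classifier, as in the paper's model comparison.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, np.array(X), np.array(y), cv=5)
print(round(scores.mean(), 3))
```

The same feature matrix could be fed to the other four models the study compares (k-nearest neighbors, support vector machine, decision tree, multi-layer perceptron) simply by swapping the estimator passed to `cross_val_score`.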
- Published
- 2024