
Using deep neural networks for kinematic analysis: Challenges and opportunities

Authors:
Neil J. Cronin
Publication Year:
2021
Publisher:
Elsevier, 2021.

Abstract

Kinematic analysis is often performed in a lab using optical cameras combined with reflective markers. With the advent of artificial intelligence techniques such as deep neural networks, it is now possible to perform such analyses without markers, making outdoor applications feasible. In this paper I summarise 2D markerless approaches for estimating joint angles, highlighting their strengths and limitations. In computer science, so-called "pose estimation" algorithms have existed for many years. These methods involve training a neural network to detect features (e.g. anatomical landmarks) using a process called supervised learning, which requires "training" images to be manually annotated. Manual labelling has several limitations, including labeller subjectivity, the requirement for anatomical knowledge, and issues related to training data quality and quantity. Neural networks typically require thousands of training examples before they can make accurate predictions, so training datasets are usually labelled by multiple people, each of whom has their own biases, which ultimately affects neural network performance. A recent approach, called transfer learning, involves modifying a model trained to perform a certain task so that it retains some learned features and is then re-trained to perform a new task. This can drastically reduce the required number of training images. Although development is ongoing, existing markerless systems may already be accurate enough for some applications, e.g. coaching or rehabilitation. Accuracy may be further improved by leveraging novel approaches and incorporating realistic physiological constraints, ultimately resulting in low-cost markerless systems that could be deployed both in and outside of the lab.
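
The final step of the pipeline the abstract describes, turning 2D landmark predictions into joint angles, is straightforward vector geometry. The following is a minimal sketch of that step, not the author's implementation; the function name and the hip/knee/ankle pixel coordinates are hypothetical, standing in for the output of any 2D pose estimation network.

```python
import numpy as np

def joint_angle(proximal, joint, distal):
    """Return the angle (degrees) at `joint` formed by three 2D landmarks.

    Each argument is an (x, y) pixel coordinate, e.g. as predicted by a
    2D pose estimation network.
    """
    a = np.asarray(proximal, dtype=float) - np.asarray(joint, dtype=float)
    b = np.asarray(distal, dtype=float) - np.asarray(joint, dtype=float)
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    # Clip guards against floating-point values just outside [-1, 1]
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Hypothetical hip, knee, and ankle coordinates (pixels): knee angle
print(joint_angle((410, 220), (430, 340), (420, 460)))  # ~166 degrees
```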

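The transfer learning approach mentioned in the abstract can also be made concrete. The sketch below assumes a PyTorch/torchvision setup, which is my choice for illustration rather than anything the paper specifies: a backbone pretrained on ImageNet is frozen so it retains its learned features, and only a new output head is re-trained to regress landmark coordinates. The landmark count and learning rate are illustrative.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a backbone pretrained on ImageNet and freeze its learned features
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in backbone.parameters():
    param.requires_grad = False

# Replace the classification head with a small regressor predicting an
# (x, y) coordinate per landmark; this new layer trains from scratch
num_landmarks = 17  # assumed count, e.g. a COCO-style skeleton
backbone.fc = nn.Linear(backbone.fc.in_features, num_landmarks * 2)

# Only the new head's parameters are given to the optimiser
optimiser = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
```

Because only the small head is updated while the frozen backbone supplies general-purpose visual features, far fewer labelled images are needed, which is the reduction in training data the abstract refers to.
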
Details

Language:
English
ISSN:
0021-9290
Database:
OpenAIRE
Accession number:
edsair.doi.dedup.....7a7d6f9fdd7b9c88a948f019fca1964d