PLoS ONE (Nov 2019)

Quantifying normal and parkinsonian gait features from home movies: Practical application of a deep learning-based 2D pose estimator.

  • Kenichiro Sato,
  • Yu Nagashima,
  • Tatsuo Mano,
  • Atsushi Iwata,
  • Tatsushi Toda

DOI: https://doi.org/10.1371/journal.pone.0223549
Journal volume & issue: Vol. 14, No. 11, e0223549

Abstract


OBJECTIVE: Gait movies recorded in daily clinical practice are usually not filmed with specialized devices, which prevents neurologists from leveraging gait analysis technologies. Here we propose a novel unsupervised approach to quantifying gait features and extracting cadence from normal and parkinsonian gait movies recorded with a home video camera, by applying OpenPose, a deep learning-based 2D pose estimator that can obtain joint coordinates from pictures or videos recorded with a monocular camera.

METHODS: Our proposed method consisted of two distinct phases: obtaining sequential gait features from movies by extracting body joint coordinates with OpenPose; and estimating the cadence of periodic gait steps from the sequential gait features using a short-time pitch detection approach.

RESULTS: Cadence estimation of gait viewed in the coronal plane (frontally viewed gait), as is frequently filmed in the daily clinical setting, was successfully conducted in normal gait movies using the short-time autocorrelation function (ST-ACF). In cases of parkinsonian gait with prominent freezing of gait and involuntary oscillations, we quantified the periodicity of each gait sequence using ACF-based statistical distance metrics; these metrics clearly corresponded with the subjects' baseline disease statuses.

CONCLUSION: The proposed method allows gait movies that have been underutilized to date to be analyzed in a completely data-driven manner, and might broaden the range of movies for which gait analyses can be conducted.
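The cadence-estimation step described in the methods (short-time pitch detection on sequential gait features) can be illustrated with a minimal sketch. The code below is not the authors' implementation: it assumes a 1D signal such as the vertical image coordinate of an ankle keypoint already extracted frame by frame with a 2D pose estimator (e.g., OpenPose), and the window length, hop size, and step-frequency bounds are illustrative assumptions.

```python
# Minimal sketch: cadence estimation from a joint-coordinate time series
# via a short-time autocorrelation function (ST-ACF).
import numpy as np

def short_time_acf(signal, fps, window_sec=3.0, hop_sec=0.5):
    """Normalized autocorrelation computed within sliding windows."""
    win = int(window_sec * fps)
    hop = int(hop_sec * fps)
    acfs = []
    for start in range(0, len(signal) - win + 1, hop):
        seg = signal[start:start + win]
        seg = seg - seg.mean()
        acf = np.correlate(seg, seg, mode="full")[win - 1:]  # lags 0..win-1
        acf = acf / acf[0] if acf[0] != 0 else acf            # acf[0] == 1
        acfs.append(acf)
    return np.array(acfs)

def estimate_cadence(signal, fps, min_step_hz=0.5, max_step_hz=3.0):
    """Estimate cadence (steps/min) from the dominant ACF peak per window."""
    acfs = short_time_acf(signal, fps)
    lo = int(fps / max_step_hz)  # shortest plausible step period (frames)
    hi = int(fps / min_step_hz)  # longest plausible step period (frames)
    cadences = []
    for acf in acfs:
        lag = lo + np.argmax(acf[lo:hi])   # lag of the strongest periodicity
        cadences.append(60.0 * fps / lag)  # convert step period to steps/min
    return np.median(cadences)

if __name__ == "__main__":
    fps = 30
    t = np.arange(0, 10, 1 / fps)
    # Synthetic ankle signal (~2 steps/s plus noise) standing in for real
    # keypoint trajectories extracted from a gait movie.
    ankle_y = np.sin(2 * np.pi * 2.0 * t) + 0.1 * np.random.randn(t.size)
    print(f"Estimated cadence: {estimate_cadence(ankle_y, fps):.1f} steps/min")
```

In this sketch, each window's autocorrelation peak at a non-zero lag gives the dominant step period, which is converted to steps per minute; taking the median across windows makes the estimate robust to occasional noisy frames.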