Sensors (Jan 2020)

An Exploration of Machine-Learning Estimation of Ground Reaction Force from Wearable Sensor Data

  • Danica Hendry,
  • Ryan Leadbetter,
  • Kristoffer McKee,
  • Luke Hopper,
  • Catherine Wild,
  • Peter O’Sullivan,
  • Leon Straker,
  • Amity Campbell

DOI
https://doi.org/10.3390/s20030740
Journal volume & issue
Vol. 20, no. 3
p. 740

Abstract


This study aimed to develop a wearable sensor system, using machine-learning models, capable of accurately estimating peak ground reaction force (GRF) during ballet jumps in the field. Female dancers (n = 30) performed a series of bilateral and unilateral ballet jumps. Dancers wore six ActiGraph Link wearable sensors (100 Hz). Data were collected simultaneously from two AMTI force platforms and synchronised with the ActiGraph data. Due to sensor hardware malfunctions and synchronisation issues, a multistage approach to model development, using a reduced data set, was taken. Using data from the 14 dancers with complete multi-sensor synchronised data, the best single sensor was determined. Subsequently, the best single sensor model was refined and validated using all available data for that sensor (23 dancers). Root mean square error (RMSE) in body weight (BW) and correlation coefficients (r) were used to assess the GRF profile, and Bland–Altman plots were used to assess model peak GRF accuracy. The model based on sacrum data was the most accurate single sensor model (unilateral landings: RMSE = 0.24 BW, r = 0.95; bilateral landings: RMSE = 0.21 BW, r = 0.98) with the refined model still showing good accuracy (unilateral: RMSE = 0.42 BW, r = 0.80; bilateral: RMSE = 0.39 BW, r = 0.92). Machine-learning models applied to wearable sensor data can provide a field-based system for GRF estimation during ballet jumps.
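The abstract reports waveform agreement as RMSE (expressed in body weight) and correlation coefficients, and peak GRF accuracy via Bland–Altman analysis. The sketch below shows how such agreement metrics are typically computed from an estimated and a force-plate GRF signal; the function name, inputs, and structure are illustrative assumptions, not code from the paper.

```python
import numpy as np
from scipy.stats import pearsonr


def grf_agreement(estimated_bw, measured_bw):
    """Compare model-estimated GRF against force-plate GRF.

    Both inputs are 1-D arrays of GRF normalised to body weight (BW),
    e.g. a landing waveform or a set of per-jump peak values.
    """
    estimated_bw = np.asarray(estimated_bw, dtype=float)
    measured_bw = np.asarray(measured_bw, dtype=float)

    # Waveform agreement: RMSE in BW and Pearson correlation r
    rmse = np.sqrt(np.mean((estimated_bw - measured_bw) ** 2))
    r, _ = pearsonr(estimated_bw, measured_bw)

    # Bland-Altman statistics for peak accuracy:
    # mean bias and 95% limits of agreement (bias +/- 1.96 SD of differences)
    diff = estimated_bw - measured_bw
    bias = np.mean(diff)
    half_width = 1.96 * np.std(diff, ddof=1)

    return {
        "rmse_bw": rmse,
        "r": r,
        "bias_bw": bias,
        "loa_lower_bw": bias - half_width,
        "loa_upper_bw": bias + half_width,
    }
```

For example, passing per-jump peak GRF estimates and the matching force-plate peaks would yield the bias and limits of agreement that a Bland–Altman plot visualises, while passing full landing waveforms yields the RMSE and r values of the kind reported above.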

Keywords