Open Research Europe (Jan 2025)
JointTracker: Real-time inertial kinematic chain tracking with joint position estimation [version 2; peer review: 1 approved, 2 approved with reservations]
Abstract
In-field motion capture is drawing increasing attention owing to its multitude of application areas, in particular human motion capture (HMC). Considerable research effort is currently devoted to camera-based markerless HMC, which, however, suffers from the inherent drawbacks of a limited field of view and occlusions. In contrast, inertial motion capture does not suffer from occlusions and is therefore a promising approach for capturing motion outside the laboratory. One major challenge of such methods, however, is the necessity of spatial registration: typically, during a predefined calibration sequence, the orientation and location of each inertial sensor are registered with respect to an underlying skeleton model. This work contributes to calibration-free inertial motion capture by proposing a recursive estimator for the simultaneous online estimation of all sensor poses and joint positions of a kinematic chain model, such as the human skeleton. The full derivation from an optimization objective is provided. The approach can be applied directly to a synchronized data stream from a body-mounted inertial sensor network. Successful evaluations are demonstrated on noisy simulated data of a three-link chain, real lower-body walking data from 25 young, healthy persons, and walking data captured from a humanoid robot.
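To illustrate the kind of kinematic constraint that makes inertial joint position estimation possible, the following Python sketch recursively refines the joint-center positions of a ball joint connecting two IMU-equipped segments, using the well-known condition that the joint-center acceleration computed from either sensor must have equal magnitude. This is a minimal, assumed illustration with a simple stochastic-gradient update; it is not the estimator derived in the paper, and all function and variable names are hypothetical.

import numpy as np

def skew(v):
    # Skew-symmetric matrix so that skew(a) @ b == np.cross(a, b).
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def joint_accel(acc, gyr, gyr_dot, o):
    # Acceleration of the joint center, expressed in the sensor frame:
    # a + w x (w x o) + dw/dt x o.
    return acc + np.cross(gyr, np.cross(gyr, o)) + np.cross(gyr_dot, o)

def recursive_joint_update(o1, o2, imu1, imu2, gain=0.005):
    # One stochastic-gradient step on the scalar residual
    # e = ||joint_accel_1(o1)|| - ||joint_accel_2(o2)||.
    a1, a2 = joint_accel(*imu1, o1), joint_accel(*imu2, o2)
    e = np.linalg.norm(a1) - np.linalg.norm(a2)

    def grad(a, gyr, gyr_dot):
        # d(joint_accel)/d(o) = gyr gyr^T - |gyr|^2 I + skew(gyr_dot)
        J = np.outer(gyr, gyr) - gyr @ gyr * np.eye(3) + skew(gyr_dot)
        return (a / np.linalg.norm(a)) @ J

    o1 = o1 - gain * e * grad(a1, imu1[1], imu1[2])
    o2 = o2 + gain * e * grad(a2, imu2[1], imu2[2])
    return o1, o2, e

# Synthetic check: generate IMU samples consistent with known joint positions,
# feed them to the recursive update, and report the block-averaged |residual|.
rng = np.random.default_rng(0)
o1_true, o2_true = np.array([0.1, 0.0, 0.3]), np.array([-0.05, 0.0, -0.25])
o1_est, o2_est, abs_e = np.zeros(3), np.zeros(3), []
for k in range(4000):
    g1, gd1, acc1 = rng.normal(size=3), rng.normal(size=3), rng.normal(size=3)
    g2, gd2 = rng.normal(size=3), rng.normal(size=3)
    m = np.linalg.norm(joint_accel(acc1, g1, gd1, o1_true))
    u = rng.normal(size=3); u /= np.linalg.norm(u)
    # Construct acc2 so that both sensors imply the same joint-center accel magnitude.
    acc2 = m * u - (np.cross(g2, np.cross(g2, o2_true)) + np.cross(gd2, o2_true))
    o1_est, o2_est, e = recursive_joint_update(
        o1_est, o2_est, (acc1, g1, gd1), (acc2, g2, gd2))
    abs_e.append(abs(e))
    if (k + 1) % 1000 == 0:
        print(f"samples {k - 998:4d}-{k + 1:4d}: mean |residual| = {np.mean(abs_e[-1000:]):.3f}")

The sketch handles only a single joint and ignores sensor orientation estimation and measurement noise models, all of which the paper's recursive estimator for full kinematic chains addresses; it is included merely to make the underlying constraint concrete.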