Scientific Reports (Jun 2024)

Simulation-driven design of smart gloves for gesture recognition

  • Clayton Leite,
  • Petr Byvshev,
  • Henry Mauranen,
  • Yu Xiao

DOI
https://doi.org/10.1038/s41598-024-65069-2
Journal volume & issue
Vol. 14, no. 1
pp. 1–22

Abstract


Smart gloves are in high demand for entertainment, manufacturing, and rehabilitation. However, designing them has been complex and costly because the process relies on trial and error. We propose an open simulation platform for designing smart gloves, covering optimal sensor placement and deep learning models for gesture recognition, that reduces costs and manual effort. Our pipeline starts with 3D hand pose extraction from videos, then refines the poses and converts them into hand joint angles via inverse kinematics, optimizes sensor placement based on hand joint analysis, and trains deep learning models on simulated sensor data. Unlike existing platforms, which always require precise motion data as input, our platform accepts monocular videos that can be captured with widely available smartphones or web cameras, and it integrates novel approaches to minimize the impact of errors introduced by imprecise motion extraction from video. Moreover, our platform enables more efficient sensor placement selection. We demonstrate how the pipeline works and how it delivers a sensible smart glove design in a real-life case study, and we evaluate the performance of each building block and its impact on the reliability of the generated design.
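
To make the pipeline described above concrete, the following is a minimal, hypothetical sketch of its stages in Python: pose extraction from video, refinement and conversion to joint angles, sensor signal simulation, and placement selection. All function names, shapes, and the variance-based selection heuristic are illustrative assumptions, not the authors' actual implementation.

# Hypothetical sketch of the pipeline in the abstract:
# video -> 3D hand poses -> joint angles (inverse kinematics)
# -> simulated sensor signals -> sensor placement selection.
# Names and heuristics are assumptions, not the paper's API.
import numpy as np

NUM_FRAMES = 120
NUM_JOINTS = 21          # typical hand-keypoint count for monocular pose estimators
CANDIDATE_SENSORS = 16   # candidate flex-sensor locations on the glove
SELECTED_SENSORS = 5

def extract_hand_poses(video_frames: np.ndarray) -> np.ndarray:
    """Placeholder for monocular 3D hand pose estimation (frames -> joint positions)."""
    return np.random.rand(NUM_FRAMES, NUM_JOINTS, 3)

def poses_to_joint_angles(poses: np.ndarray) -> np.ndarray:
    """Placeholder refinement + inverse kinematics: smooth noisy poses over time,
    then map each joint to a scalar bend angle (toy mapping)."""
    kernel = np.ones(5) / 5.0
    smoothed = np.apply_along_axis(
        lambda x: np.convolve(x, kernel, mode="same"), 0, poses)
    return np.linalg.norm(smoothed, axis=-1)          # shape: (frames, joints)

def simulate_sensor_signals(angles: np.ndarray, coupling: np.ndarray) -> np.ndarray:
    """Each candidate sensor reads a weighted mix of joint angles, plus noise."""
    signals = angles @ coupling.T                     # shape: (frames, candidates)
    return signals + 0.01 * np.random.randn(*signals.shape)

def select_placements(signals: np.ndarray, k: int) -> np.ndarray:
    """Simple proxy for placement optimization: keep the k most informative channels."""
    return np.argsort(signals.var(axis=0))[-k:]

if __name__ == "__main__":
    video = np.zeros((NUM_FRAMES, 64, 64, 3))                  # stand-in for a recorded clip
    angles = poses_to_joint_angles(extract_hand_poses(video))
    coupling = np.random.rand(CANDIDATE_SENSORS, NUM_JOINTS)   # joint-to-sensor coupling
    signals = simulate_sensor_signals(angles, coupling)
    chosen = select_placements(signals, SELECTED_SENSORS)
    print("selected sensor locations:", chosen)
    # The simulated signals at the chosen locations would then be used to train
    # a gesture recognition model before any physical glove is fabricated.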

Keywords