IEEE Access (Jan 2024)

Real-Time Dynamic Gesture Recognition Method Based on Gaze Guidance

  • Binbin Zhang,
  • Weiqing Li,
  • Zhiyong Su

DOI
https://doi.org/10.1109/ACCESS.2024.3482459
Journal volume & issue
Vol. 12
pp. 161084–161095

Abstract

Gesture recognition technology is widely applied in human-computer interaction. However, issues such as large model parameter counts and false positives in real-world interactive scenarios persist. This paper proposes a real-time dynamic gesture recognition method based on gaze guidance, which uses gaze tracking data to accurately segment hand skeletal sequences. The skeletal data are then pre-processed with multidimensional feature extraction. Finally, a lightweight multi-feature fusion recognition network performs the gesture recognition. Experiments on public datasets and simulated interaction scenarios demonstrate that the proposed method achieves higher recognition accuracy than mainstream methods while using only approximately 0.15M parameters and 3 ms of inference time. In particular, on the simulated 14-class and 28-class dynamic gesture recognition tasks, it obtains Levenshtein accuracies of 95.9% and 94.5%, respectively, approximately 20% higher than those of mainstream methods.
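The Levenshtein accuracy reported above scores a predicted stream of gestures against the ground-truth sequence by edit distance rather than by per-frame labels. The Python sketch below illustrates the usual definition (1 minus the normalized edit distance between the predicted and ground-truth gesture label sequences); the function names and gesture labels are illustrative assumptions, not taken from the paper.

# Minimal sketch (not from the paper): Levenshtein accuracy as commonly used
# for online/continuous gesture recognition.

def levenshtein_distance(a, b):
    """Edit distance between two label sequences (insert/delete/substitute cost 1)."""
    m, n = len(a), len(b)
    dp = list(range(n + 1))  # dp[j] = distance between a[:i] and b[:j]
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,                          # deletion
                        dp[j - 1] + 1,                      # insertion
                        prev + (a[i - 1] != b[j - 1]))      # substitution
            prev = cur
    return dp[n]

def levenshtein_accuracy(predicted, ground_truth):
    """1 - normalized edit distance; 1.0 means the gesture stream was recognized exactly."""
    return 1.0 - levenshtein_distance(predicted, ground_truth) / max(len(ground_truth), 1)

# Example: one missed gesture in a ground-truth stream of four gestures -> 0.75.
print(levenshtein_accuracy(["swipe_left", "tap", "grab"],
                           ["swipe_left", "tap", "pinch", "grab"]))

Because insertions are penalized like deletions and substitutions, this metric also reflects false positives, i.e. spurious gestures triggered between the intended ones.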

Keywords