Sensors (Feb 2020)

A Survey on Hand Pose Estimation with Wearable Sensors and Computer-Vision-Based Methods

  • Weiya Chen,
  • Chenchen Yu,
  • Chenyu Tu,
  • Zehua Lyu,
  • Jing Tang,
  • Shiqi Ou,
  • Yan Fu,
  • Zhidong Xue

DOI: https://doi.org/10.3390/s20041074
Journal volume & issue: Vol. 20, No. 4, Article 1074

Abstract


Real-time sensing and modeling of the human body, especially the hands, is an important research endeavor with applications such as natural human-computer interaction. Hand pose estimation remains a major academic and technical challenge due to the complex structure and dexterous movement of human hands. Driven by advances in both hardware and artificial intelligence, various data glove prototypes and computer-vision-based methods have been proposed in recent years for accurate and rapid hand pose estimation. However, existing reviews focus either on data gloves or on vision-based methods, and some are restricted to a particular type of camera, such as depth cameras. The purpose of this survey is to provide a comprehensive and timely review of recent research advances in sensor-based hand pose estimation, covering both wearable and vision-based solutions. Hand kinematic models are discussed first. An in-depth review is then conducted on data gloves and vision-based sensor systems, together with their corresponding modeling methods. In particular, deep-learning-based methods, which are very promising for hand pose estimation, are also discussed. Finally, the advantages and drawbacks of current hand pose estimation methods, their application scope, and related challenges are examined.
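The abstract states that hand kinematic models are the starting point of the review. As a rough illustration only (not the paper's own formulation), the sketch below shows how a single finger in such a model might be parameterized by per-joint flexion angles, with keypoint positions recovered by chaining rotations along the finger. The segment lengths and function names (SEGMENT_LENGTHS, finger_forward_kinematics) are hypothetical placeholders; a full hand model would add abduction/adduction degrees of freedom and cover all five fingers plus the wrist.

```python
# Minimal planar forward-kinematics sketch for one finger (illustrative only).
# Joint angles (MCP, PIP, DIP flexion) rotate each segment relative to its
# parent; keypoint positions follow by accumulating rotations along the chain.
import numpy as np

# Hypothetical segment lengths (cm): proximal, middle, distal phalanges.
SEGMENT_LENGTHS = [4.0, 2.5, 2.0]

def finger_forward_kinematics(base, flexion_angles, lengths=SEGMENT_LENGTHS):
    """Return 2D positions of the finger joints and fingertip.

    base           -- (x, y) position of the MCP joint.
    flexion_angles -- flexion at MCP, PIP, DIP in radians (planar model).
    """
    points = [np.asarray(base, dtype=float)]
    cumulative_angle = 0.0
    for angle, length in zip(flexion_angles, lengths):
        cumulative_angle += angle  # each joint rotates relative to its parent
        direction = np.array([np.cos(cumulative_angle), np.sin(cumulative_angle)])
        points.append(points[-1] + length * direction)
    return np.stack(points)  # shape (4, 2): MCP, PIP, DIP, fingertip

if __name__ == "__main__":
    # Example: 30 degrees of flexion at each joint of one finger.
    keypoints = finger_forward_kinematics(base=(0.0, 0.0),
                                          flexion_angles=np.radians([30, 30, 30]))
    print(keypoints)
```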

Keywords