IEEE Access (Jan 2020)

Hand Gesture Recognition for Sign Language Using 3DCNN

  • Muneer Al-Hammadi,
  • Ghulam Muhammad,
  • Wadood Abdul,
  • Mansour Alsulaiman,
  • Mohamed A. Bencherif,
  • Mohamed Amine Mekhtiche

DOI
https://doi.org/10.1109/ACCESS.2020.2990434
Journal volume & issue
Vol. 8
pp. 79491–79509

Abstract

Recently, automatic hand gesture recognition has gained increasing importance for two principal reasons: the growth of the deaf and hearing-impaired population, and the development of vision-based applications and touchless control on ubiquitous devices. Hand gesture recognition is at the core of sign language analysis, so a robust recognition system must capture both spatial and temporal features. Unfortunately, finding discriminative spatiotemporal descriptors for a hand gesture sequence is not a trivial task. In this study, we propose an efficient deep 3D convolutional neural network (3DCNN) approach for hand gesture recognition. The proposed approach employs transfer learning to overcome the scarcity of large labeled hand gesture datasets. We evaluated it on three color-video gesture datasets, using 40, 23, and 10 classes from these datasets, respectively. In the signer-dependent mode, the approach obtained recognition rates of 98.12%, 100%, and 76.67% on the three datasets, respectively; in the signer-independent mode, it obtained 84.38%, 34.9%, and 70%.
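The full paper details the authors' exact 3DCNN architecture and transfer-learning setup; the snippet below is only a minimal sketch of the general idea, assuming a PyTorch/torchvision environment and using a Kinetics-pretrained R3D-18 backbone as a stand-in for the pretrained 3D network. The backbone choice, frozen layers, and input sizes here are illustrative assumptions, not the paper's configuration.

```python
# Minimal sketch (not the authors' implementation): fine-tuning a
# Kinetics-pretrained 3D CNN for video-based gesture classification.
import torch
import torch.nn as nn
from torchvision.models.video import r3d_18

NUM_CLASSES = 40  # e.g., the 40-class dataset mentioned in the abstract

# Load a 3D CNN pretrained on Kinetics-400 (transfer learning to cope
# with the limited amount of labeled sign-gesture video).
model = r3d_18(weights="KINETICS400_V1")

# Replace the classification head with one sized for the gesture classes.
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Illustrative choice: freeze early blocks and fine-tune only the last
# residual stage and the new head.
for name, param in model.named_parameters():
    if not (name.startswith("layer4") or name.startswith("fc")):
        param.requires_grad = False

# A clip is a 5-D tensor: (batch, channels, frames, height, width).
clip = torch.randn(2, 3, 16, 112, 112)
logits = model(clip)  # shape: (2, NUM_CLASSES)
print(logits.shape)
```

In this kind of setup, only the unfrozen parameters are passed to the optimizer, so the pretrained spatiotemporal features are largely reused while the classifier adapts to the new gesture vocabulary.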

Keywords