IEEE Access (Jan 2023)

Smart Healthcare Hand Gesture Recognition Using CNN-Based Detector and Deep Belief Network

  • Mohammed Alonazi,
  • Hira Ansar,
  • Naif Al Mudawi,
  • Saud S. Alotaibi,
  • Nouf Abdullah Almujally,
  • Abdulwahab Alazeb,
  • Ahmad Jalal,
  • Jaekwang Kim,
  • Moohong Min

DOI
https://doi.org/10.1109/ACCESS.2023.3289389
Journal volume & issue
Vol. 11
pp. 84922–84933

Abstract

Gesture recognition in dynamic images is a challenging problem in computer vision, automation, and the medical field. Hand gesture tracking and recognition between human and computer must remain symmetric in real-world settings. With advances in sensor technology, numerous researchers have recently proposed RGB gesture recognition techniques. In this paper, we introduce a reliable hand gesture tracking and recognition model that remains accurate even in complex environments and can track and recognize dynamic RGB gestures. First, videos are converted into frames. After light-intensity adjustment and noise removal, the frames are passed through a CNN for hand extraction. Features are then extracted from the full extracted hand: neural gas and locomotion thermal mapping features are combined to form the feature vector. The feature vector is then passed through a fuzzy optimizer to reduce uncertainty and fuzziness. The optimized features are finally passed to a Deep Belief Network (DBN) classifier to classify the gestures. The EgoGesture and Jester datasets are used to validate the proposed system. Experimental results on the EgoGesture and Jester datasets demonstrate overall accuracies of 90.73% and 89.33%, respectively. The experiments confirm the reliability of our system and the suitability of the proposed model compared with other state-of-the-art models.
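
As a rough illustration of the pipeline described in the abstract (frame extraction, light-intensity adjustment and denoising, CNN-based hand detection, feature extraction, fuzzy optimization, and DBN classification), the minimal Python sketch below wires the stages together. The preprocessing uses standard OpenCV operations; the detector, feature extractors (neural gas, locomotion thermal mapping), fuzzy optimizer, and DBN classifier are hypothetical placeholders (detect_hand, extract_features, fuzzy_optimize, classify_dbn), not the authors' implementation.

    # Illustrative sketch of the abstract's pipeline; all names are placeholders.
    import cv2
    import numpy as np

    def video_to_frames(path, step=5):
        """Read a video and return every `step`-th frame as an RGB array."""
        frames, cap, idx = [], cv2.VideoCapture(path), 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if idx % step == 0:
                frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            idx += 1
        cap.release()
        return frames

    def preprocess(frame):
        """Light-intensity adjustment (histogram equalization on luminance)
        followed by simple noise removal (Gaussian blur)."""
        ycrcb = cv2.cvtColor(frame, cv2.COLOR_RGB2YCrCb)
        ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])
        adjusted = cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2RGB)
        return cv2.GaussianBlur(adjusted, (5, 5), 0)

    def detect_hand(frame):
        """Placeholder for the CNN-based hand detector: a real system would
        run a trained detector here; we return a dummy central crop."""
        h, w, _ = frame.shape
        return frame[h // 4: 3 * h // 4, w // 4: 3 * w // 4]

    def extract_features(hand_crop):
        """Placeholder for neural-gas and locomotion-thermal-mapping features;
        here a coarse grayscale grid is flattened into a vector."""
        gray = cv2.cvtColor(hand_crop, cv2.COLOR_RGB2GRAY)
        return cv2.resize(gray, (8, 8)).astype(np.float32).ravel() / 255.0

    def fuzzy_optimize(features):
        """Placeholder fuzzy optimizer: squash features through a smooth
        membership-like function to damp extreme (uncertain) values."""
        return 1.0 / (1.0 + np.exp(-4.0 * (features - features.mean())))

    def classify_dbn(feature_vector):
        """Placeholder for the Deep Belief Network classifier."""
        rng = np.random.default_rng(0)
        logits = rng.standard_normal(5) + feature_vector.sum() * 0.01
        return int(np.argmax(logits))  # dummy gesture label

    if __name__ == "__main__":
        for frame in video_to_frames("gesture_clip.mp4"):
            vec = fuzzy_optimize(extract_features(detect_hand(preprocess(frame))))
            print("predicted gesture id:", classify_dbn(vec))

In the paper's actual system, the placeholder stages would be replaced by the trained CNN detector, the neural gas and locomotion thermal mapping feature extractors, and the trained DBN; the sketch only conveys the order and data flow of the stages.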

Keywords