Applied Sciences (Jul 2022)

Hand Gesture Recognition via Lightweight VGG16 and Ensemble Classifier

  • Edmond Li Ren Ewe,
  • Chin Poo Lee,
  • Lee Chung Kwek,
  • Kian Ming Lim

DOI
https://doi.org/10.3390/app12157643
Journal volume & issue
Vol. 12, no. 15
p. 7643

Abstract

Gesture recognition has long been studied within the fields of computer vision and pattern recognition. A gesture can be defined as a meaningful physical movement of the fingers, hands, arms, or other parts of the body made to convey information for interaction with the environment. For instance, hand gesture recognition (HGR) can be used to recognize sign language, the primary means of communication for the deaf and mute. Vision-based HGR has critical applications; however, it must overcome challenges such as variations in background, illumination, hand orientation and size, and similarities among gestures. Traditional machine learning approaches have been widely used in vision-based HGR in recent years, but their processing complexity, especially the handcrafted feature extraction, remains a major challenge, and the effectiveness of handcrafted features has not been proven across various datasets in comparison to deep learning techniques. Therefore, a hybrid architecture dubbed Lightweight VGG16 and Random Forest (Lightweight VGG16-RF) is proposed for vision-based hand gesture recognition. The proposed model performs feature extraction with a convolutional neural network (CNN) and uses a machine learning method for classification. Experiments were carried out on publicly available datasets, namely the American Sign Language (ASL), ASL Digits, and NUS Hand Posture datasets. The experimental results demonstrate that the proposed model, a combination of lightweight VGG16 and random forest, outperforms other methods.
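The sketch below illustrates the hybrid idea described in the abstract: a VGG16-style CNN backbone used as a feature extractor, followed by a random forest classifier. It is a minimal illustration, not the authors' exact pipeline; the choice of an ImageNet-pretrained backbone, global average pooling, input size, number of trees, and the dummy data are all assumptions made for the example.

```python
# Minimal sketch: CNN feature extraction + random forest classification.
# Layer choices and hyperparameters are illustrative assumptions, not the
# paper's exact Lightweight VGG16-RF configuration.
import numpy as np
from tensorflow.keras.applications import VGG16
from tensorflow.keras.applications.vgg16 import preprocess_input
from tensorflow.keras.layers import GlobalAveragePooling2D
from tensorflow.keras.models import Model
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# Backbone: ImageNet-pretrained VGG16 without its dense head; global average
# pooling yields a compact 512-D feature vector per image.
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
extractor = Model(inputs=base.input,
                  outputs=GlobalAveragePooling2D()(base.output))

def extract_features(images):
    """images: array of shape (N, 224, 224, 3) with pixel values in 0-255."""
    return extractor.predict(preprocess_input(images.astype("float32")),
                             verbose=0)

# Dummy data standing in for a hand-gesture dataset (e.g., ASL images).
rng = np.random.default_rng(0)
x_train = rng.uniform(0, 255, size=(32, 224, 224, 3))
y_train = rng.integers(0, 5, size=32)
x_test = rng.uniform(0, 255, size=(8, 224, 224, 3))
y_test = rng.integers(0, 5, size=8)

# Classification stage: random forest trained on the CNN features.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(extract_features(x_train), y_train)
pred = clf.predict(extract_features(x_test))
print("accuracy:", accuracy_score(y_test, pred))
```

The design point is that the CNN is only used as a fixed feature extractor, so the classifier can be trained quickly without end-to-end backpropagation through the backbone.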

Keywords