Scientific Journal of Astana IT University (Sep 2023)

GESTURE RECOGNITION OF MACHINE LEARNING AND CONVOLUTIONAL NEURAL NETWORK METHODS FOR KAZAKH SIGN LANGUAGE

  • Samat Mukhanov,
  • Raissa Uskenbayeva,
  • Young Im Cho,
  • Dauren Kabyl,
  • Nurzhan Les,
  • Maqsat Amangeldi

DOI: https://doi.org/10.37943/15LPCU4095

Abstract

Recently, public interest in machine learning and neural networks has grown considerably, largely because technological advances have improved computer recognition of objects, sounds, texts, and other data types. As a result, human-computer interaction is becoming more natural and comprehensible to the average person. Progress in computer vision has enabled increasingly sophisticated models for object recognition in images and video, and these models can also be applied to hand gesture recognition. In this research, popular hand gesture recognition models, namely the Convolutional Neural Network (CNN), Long Short-Term Memory (LSTM), and Support Vector Machine (SVM), were examined. These models differ in their approaches, processing time, and required training data size. Experiments produced differing results when training these models to recognize Kazakh sign language based on the dactyl alphabet. This article provides a detailed description of each method, its purpose, and its effectiveness in terms of performance and training. Numerous experimental results were recorded in a table, showing the recognition accuracy for each gesture. Additionally, specific hand gestures were tested in front of a camera, with the recognized gesture displayed on the screen. Mathematical formulas and functions are used to explain the working principles of the machine learning algorithms, together with the logical scheme and structure of the LSTM algorithm.
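To make the comparison concrete, the sketch below illustrates one of the three compared methods, an SVM classifier, applied to static dactyl gestures. This is not the authors' code: the letter labels, the 42-dimensional feature layout (e.g. 21 hand landmarks with x and y coordinates), and the synthetic, well-separated clusters standing in for real landmark data are all illustrative assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
letters = ["A", "B", "V"]          # three dactyl letters, for illustration only
n_per_class, n_features = 100, 42  # assumed: 21 hand landmarks x (x, y)

# Synthetic, linearly well-separated clusters stand in for real landmark data;
# a real pipeline would extract these features from camera frames.
X = np.vstack([
    rng.normal(loc=3.0 * i, scale=0.5, size=(n_per_class, n_features))
    for i in range(len(letters))
])
y = np.repeat(np.arange(len(letters)), n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# RBF-kernel SVM, one of the three methods compared in the article.
clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
print(f"test accuracy: {accuracy:.2f}")
```

On such cleanly separated synthetic clusters the SVM reaches near-perfect accuracy; the article's experiments measure how the three methods compare on real gesture data, where class overlap and temporal dynamics (handled by the LSTM) matter.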
