Scientific Reports (Jul 2022)

An integrated mediapipe-optimized GRU model for Indian sign language recognition

  • Barathi Subramanian,
  • Bekhzod Olimov,
  • Shraddha M. Naik,
  • Sangchul Kim,
  • Kil-Houm Park,
  • Jeonghong Kim

DOI
https://doi.org/10.1038/s41598-022-15998-7
Journal volume & issue
Vol. 12, no. 1
pp. 1–16

Abstract

Sign language recognition is challenged by problems such as accurately tracking hand gestures, occlusion of the hands, and high computational cost. Recently, it has benefited from advances in deep learning techniques. However, these large, complex models struggle with long-term sequential data and show poor information processing and learning efficiency when capturing useful information. To overcome these challenges, we propose an integrated MediaPipe-optimized gated recurrent unit (MOPGRU) model for Indian sign language recognition. Specifically, we improve the update gate of the standard GRU cell by multiplying it by the reset gate, so that redundant information from the past is discarded in a single screening; feedback from the reset gate's output also directs additional attention to the current input. In addition, we replace the hyperbolic tangent activation of the standard GRU with the exponential linear unit activation, and SoftMax with Softsign activation in the output layer of the GRU cell. As a result, the proposed MOPGRU model achieves better prediction accuracy, higher learning efficiency and information-processing capability, and faster convergence than other sequential models.
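
To make the described gate modification concrete, the following is a minimal PyTorch sketch, not the authors' code: it multiplies the update gate by the reset gate, uses ELU instead of tanh for the candidate state, and applies Softsign at the output layer. The class names, gate equations, and output-layer placement are assumptions inferred from the abstract; the paper gives the exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MOPGRUCellSketch(nn.Module):
    """Sketch of a GRU cell with the modifications described in the abstract."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.x2h = nn.Linear(input_size, 3 * hidden_size)   # input projections (r, z, candidate)
        self.h2h = nn.Linear(hidden_size, 3 * hidden_size)  # hidden projections (r, z, candidate)

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        x_r, x_z, x_n = self.x2h(x).chunk(3, dim=-1)
        h_r, h_z, h_n = self.h2h(h).chunk(3, dim=-1)

        r = torch.sigmoid(x_r + h_r)       # reset gate
        z = torch.sigmoid(x_z + h_z) * r   # update gate multiplied by the reset gate
        n = F.elu(x_n + r * h_n)           # ELU candidate state (tanh replaced)

        return (1 - z) * h + z * n         # new hidden state


class MOPGRUClassifierSketch(nn.Module):
    """Runs the modified cell over a sequence; Softsign replaces SoftMax at the output."""

    def __init__(self, input_size: int, hidden_size: int, num_classes: int):
        super().__init__()
        self.cell = MOPGRUCellSketch(input_size, hidden_size)
        self.hidden_size = hidden_size
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, seq: torch.Tensor) -> torch.Tensor:  # seq: (T, B, input_size)
        h = seq.new_zeros(seq.size(1), self.hidden_size)
        for x_t in seq:                      # step through the sequence of frames
            h = self.cell(x_t, h)
        return F.softsign(self.head(h))      # Softsign output activation
```

In a pipeline like the one the title suggests, each time step of `seq` would hold MediaPipe hand landmarks for one video frame; that input format is likewise an assumption based on the abstract.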