Sensors (Jul 2023)

Dataglove for Sign Language Recognition of People with Hearing and Speech Impairment via Wearable Inertial Sensors

  • Ang Ji,
  • Yongzhen Wang,
  • Xin Miao,
  • Tianqi Fan,
  • Bo Ru,
  • Long Liu,
  • Ruicheng Nie,
  • Sen Qiu

DOI
https://doi.org/10.3390/s23156693
Journal volume & issue
Vol. 23, no. 15
p. 6693

Abstract


Finding ways to enable seamless communication between deaf and able-bodied individuals has been a challenging and pressing issue. This paper addresses the problem by designing a low-cost data glove that uses multiple inertial sensors to achieve efficient and accurate sign language recognition. In this study, four machine learning models, namely decision tree (DT), support vector machine (SVM), k-nearest neighbors (KNN), and random forest (RF), were employed to recognize 20 types of dynamic sign language used by deaf individuals, alongside a proposed attention-based bidirectional long short-term memory network (Attention-BiLSTM). The study also examines how the number and placement of data glove sensor nodes affect the accuracy of recognizing complex dynamic sign language. Finally, the proposed method is compared with existing state-of-the-art algorithms on nine public datasets. The results show that the Attention-BiLSTM and RF algorithms achieve the highest accuracy on the 20 dynamic sign language gestures, at 98.85% and 97.58%, respectively, demonstrating the feasibility of the proposed data glove and recognition methods. This work may serve as a reference for the development of wearable sign language recognition devices and promote easier communication between deaf and able-bodied individuals.
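To illustrate the kind of model the abstract refers to, the following is a minimal sketch of an attention-based BiLSTM classifier over windows of data-glove IMU readings. The layer sizes, number of sensor channels, and window length are assumptions for illustration, not values taken from the paper.

```python
# Hypothetical Attention-BiLSTM sketch for classifying data-glove IMU sequences.
# Architecture details (hidden size, feature count, window length) are assumed.
import torch
import torch.nn as nn

class AttentionBiLSTM(nn.Module):
    def __init__(self, n_features=30, hidden=64, n_classes=20):
        super().__init__()
        # Bidirectional LSTM over the time dimension of the IMU window.
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True,
                            bidirectional=True)
        # One learned score per time step; softmax turns scores into attention weights.
        self.score = nn.Linear(2 * hidden, 1)
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                     # x: (batch, time, n_features)
        h, _ = self.lstm(x)                   # h: (batch, time, 2*hidden)
        w = torch.softmax(self.score(h), 1)   # w: (batch, time, 1)
        context = (w * h).sum(dim=1)          # attention-weighted sum over time
        return self.fc(context)               # class logits

# Example: assume 5 IMU nodes x 6 channels = 30 features, 100-sample windows,
# and 20 dynamic sign classes.
model = AttentionBiLSTM()
logits = model(torch.randn(8, 100, 30))       # -> (8, 20)
```

The same windowed features, flattened per window, could be fed to the classical baselines (DT, SVM, KNN, RF) mentioned in the abstract, e.g., via scikit-learn, to reproduce a comparable model comparison.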

Keywords