IEEE Access (Jan 2024)
A Novel MEMS and Flex Sensor-Based Hand Gesture Recognition and Regenerating System Using a Deep Learning Model
Abstract
This article presents a wearable glove fitted with flex sensors and a MEMS-based accelerometer array for detecting hand movements for sign language recognition. The functionality and performance of the glove were extensively evaluated for repeatability across different hand gestures (the American Sign Language figures for numbers 1–5 and 10) while real-time raw data were acquired. A multilayer perceptron feed-forward neural network (MLPFFNN) was chosen as the artificial neural network (ANN) algorithm for gesture classification. To create a comprehensive database of hand movements, both flex sensor and accelerometer data were used to generate pulse width modulation (PWM) values, which served as input to the model. A total of 5204 data points, comprising acceleration (ACC) and flex sensor values, were recorded for model training and movement detection (75% of the data were used for training and 25% for testing). The model's predictions were compared with the actual values and analyzed statistically. The model output was then transferred to a purpose-built robotic hand platform to test accuracy, and the resulting movements were observed. The original hand movements and the model-generated robotic hand movements were found to be closely matched. Compared with existing methods, the proposed method improves the accuracy of sign language recognition and enhances the tracking of hand movements. The statistical results show a classification accuracy of 99.67% on the measured test data across the various recognition scenarios.
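The classification pipeline described above (sensor feature vectors fed to an MLP, with a 75%/25% train/test split) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the feature count, class count, network size, and the synthetic stand-in data are all assumptions made for the example.

```python
# Hedged sketch of the abstract's MLP gesture-classification step.
# Assumptions (not from the paper): 8 input features standing in for
# flex-sensor and accelerometer channels, 6 gesture classes, and
# synthetic separable data in place of the real glove recordings.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_samples, n_features, n_classes = 5204, 8, 6  # 5204 matches the reported dataset size

# Synthetic stand-in for glove data: one cluster centre per gesture class,
# plus small Gaussian noise, so the toy problem is learnable.
centres = rng.normal(size=(n_classes, n_features))
labels = rng.integers(0, n_classes, size=n_samples)
X = centres[labels] + 0.1 * rng.normal(size=(n_samples, n_features))

# 75% training / 25% testing, as stated in the abstract.
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0)

# A small feed-forward MLP classifier (hidden-layer size is illustrative).
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"test accuracy: {accuracy:.2%}")
```

On real glove data the reported figure was 99.67%; the number printed here reflects only the synthetic toy problem.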
Keywords