ICT Express (Dec 2023)

EMG-based 3D hand gesture prediction using transformer–encoder classification

  • Tahira Mahboob,
  • Min Young Chung,
  • Kae Won Choi

Journal volume & issue
Vol. 9, no. 6
pp. 1047 – 1052

Abstract

One of the most common tasks in electromyography (EMG)-based human–machine interfaces (HMIs) is hand gesture recognition. Making robust and accurate predictions from the surface EMG (sEMG) signal is a key challenge. In this paper, we present a 3D hand gesture prediction application that leverages the sEMG signal together with optical hand tracking information. A transformer–encoder classifier (TEC) module is introduced into an IPC system to predict 3D hand gestures using eight monopolar sEMG channels as input. An experimental testbed is set up to acquire data, train the model, and predict 3D hand gestures within a feasible range of performance. Performance is evaluated in terms of the percentage of correctly classified keypoints (PCK). PCK is measured by first estimating the Euclidean distance between the actual and the predicted keypoints; the percentage of keypoints within a threshold distance is then calculated. Results from the ablation study indicate that the proposed scheme achieves a PCK of up to 72.8%, 92.7%, 97.2%, and 98.6% at thresholds of 5 mm, 10 mm, 20 mm, and 30 mm, respectively.
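As an illustration of the evaluation metric described above, the following is a minimal sketch of a PCK computation: the Euclidean distance between each predicted and ground-truth 3D keypoint is taken, and the percentage of keypoints falling within each threshold is reported. The function name, array shapes, and millimetre units are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def pck(predicted, actual, thresholds=(5.0, 10.0, 20.0, 30.0)):
    """Percentage of correctly classified keypoints (PCK).

    predicted, actual: arrays of shape (n_keypoints, 3), assumed to be
    3D hand keypoint coordinates in millimetres.
    Returns a dict mapping each threshold (mm) to the percentage of
    keypoints whose prediction error is within that threshold.
    """
    predicted = np.asarray(predicted, dtype=float)
    actual = np.asarray(actual, dtype=float)
    # Euclidean distance between each predicted and actual keypoint
    distances = np.linalg.norm(predicted - actual, axis=-1)
    # Fraction of keypoints within each threshold, expressed as a percentage
    return {t: float(np.mean(distances <= t) * 100.0) for t in thresholds}

# Example usage with random stand-in keypoints (21 hand keypoints assumed):
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gt = rng.uniform(0, 100, size=(21, 3))
    pred = gt + rng.normal(0, 5, size=(21, 3))
    print(pck(pred, gt))
```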

Keywords