IEEE Transactions on Neural Systems and Rehabilitation Engineering (Jan 2022)

Real-Time Multiple Gesture Recognition: Application of a Lightweight Individualized 1D CNN Model to an Edge Computing System

  • Zengjie Yu,
  • Yiqian Lu,
  • Qi An,
  • Cong Chen,
  • Ye Li,
  • Yishan Wang

DOI
https://doi.org/10.1109/TNSRE.2022.3165858
Journal volume & issue
Vol. 30
pp. 990–998

Abstract

A human–machine interface (HMI) detects electrophysiological signals from the subject and controls a machine based on the information those signals carry. However, most applications are still at the testing stage and are generally unavailable to the public. In recent years, researchers have worked to make wearable HMI devices smarter and more comfortable. In this study, a wearable, intelligent eight-channel electromyography (EMG)-based system was designed to recognize 21 types of gestures. An analog front-end (AFE) integrated circuit (IC) was developed to detect the EMG signals, and an EMG signal acquisition device integrated into an elastic armband was fabricated. The SIAT database of 21 gestures was established by collecting EMG gesture signals from 10 volunteers. A lightweight 1D CNN model was constructed and given individualized training on the SIAT database. The maximum recognition accuracy was 89.96%, and the average model training time was 14 min 13 s. Given its small size, the model can run on lower-performance edge computing devices and is expected to be deployed on smartphone terminals in the future. The source code is available at https://github.com/Siat-F9/EMG-Tools.
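
As a rough illustration of what such a lightweight 1D CNN might look like, the sketch below builds a small PyTorch model that maps fixed-length windows of eight-channel EMG to one of 21 gesture classes. The window length, layer widths, and kernel sizes are illustrative assumptions, not the architecture reported in the paper; the authors' actual implementation is in the EMG-Tools repository linked above.

# Hypothetical sketch (not the authors' released code): a lightweight 1D CNN
# classifying fixed-length windows of 8-channel EMG into 21 gesture classes.
# Window length, layer widths, and kernel sizes are illustrative assumptions.
import torch
import torch.nn as nn

class LightweightEMG1DCNN(nn.Module):
    def __init__(self, n_channels: int = 8, n_classes: int = 21):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),  # temporal filters over the EMG window
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.AdaptiveAvgPool1d(1),  # global pooling keeps the parameter count small for edge devices
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 8 channels, window_len samples)
        return self.classifier(self.features(x).squeeze(-1))

if __name__ == "__main__":
    model = LightweightEMG1DCNN()
    dummy = torch.randn(4, 8, 200)   # 4 windows of 8-channel EMG, 200 samples each (assumed window length)
    print(model(dummy).shape)        # torch.Size([4, 21])

Individualized training, as described in the abstract, would then amount to fitting one such model per subject on that subject's own recordings from the SIAT database.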

Keywords