IEEE Access (Jan 2020)
Real-Time EEG–EMG Human–Machine Interface-Based Control System for a Lower-Limb Exoskeleton
Abstract
This article presents a rehabilitation technique based on a lower-limb exoskeleton integrated with a human–machine interface (HMI). The HMI records and processes multimodal signals: foot motor imagery (MI) signals acquired by a brain–machine interface (BMI) and multichannel electromyographic (EMG) signals recorded from leg muscles. Existing HMI-equipped rehabilitation assistive technologies have shown considerable success under laboratory conditions, but they face two difficulties: the limited accuracy of detecting MI in electroencephalography (EEG) and the unreliability of online control while a patient wearing an exoskeleton executes a movement. For the lower limbs in particular, reliably detecting leg movement intentions and distinguishing between them remains an open problem in BMI systems. Aiming at a rehabilitation technique that replicates the natural mode of motor control in patients walking in an exoskeleton, we show how the combined use of multimodal signals improves the accuracy, performance, and reliability of the HMI. The system was tested on healthy subjects operating the exoskeleton under different conditions, and the study also produced algorithms for collecting, processing, and classifying multimodal HMI data. The developed system can analyze up to 15 signals simultaneously in real time during a movement. Foot MI is extracted from EEG signals (seven channels) using the event-related (de)synchronization effect. Supplemented by EMG signals reflecting motor intention, the control system can initiate and differentiate movements of the right and left legs with a high degree of reliability. The classification and control system operates online while the exoskeleton is executing a movement.
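The abstract states that foot MI is detected via the event-related (de)synchronization (ERD/ERS) effect, i.e. a drop in sensorimotor-band EEG power during imagined movement relative to a rest baseline. The sketch below illustrates that standard ERD% computation on synthetic data; the 8–13 Hz band, filter order, and signal parameters are illustrative assumptions, not values taken from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpower(x, fs, lo, hi):
    """Mean power of signal x in the [lo, hi] Hz band (4th-order Butterworth band-pass)."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    y = filtfilt(b, a, x)
    return np.mean(y ** 2)

def erd_percent(baseline, trial, fs, band=(8.0, 13.0)):
    """ERD% = (P_baseline - P_trial) / P_baseline * 100.
    Positive values mean band power dropped during the trial (desynchronization)."""
    p_ref = bandpower(baseline, fs, *band)
    p_act = bandpower(trial, fs, *band)
    return (p_ref - p_act) / p_ref * 100.0

# Synthetic demo (assumed parameters): a 10 Hz mu rhythm whose
# amplitude halves during the "imagery" segment.
fs = 250
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(0)
baseline = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
trial = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
print(f"ERD = {erd_percent(baseline, trial, fs):.1f}%")  # strongly positive
```

In an online system such as the one described, this statistic would be computed per EEG channel over a sliding window and fed to the classifier alongside the EMG features.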
Keywords