IEEE Access (Jan 2021)

A Deep Learning Approach for Brain Computer Interaction-Motor Execution EEG Signal Classification

  • Nesma E. Elsayed,
  • Ahmed S. Tolba,
  • Magdi Z. Rashad,
  • Tamer Belal,
  • Shahenda Sarhan

DOI
https://doi.org/10.1109/ACCESS.2021.3097797
Journal volume & issue
Vol. 9
pp. 101513 – 101529

Abstract


Recently, noninvasive Electroencephalogram (EEG) systems have been gaining much attention. Brain-Computer Interface (BCI) systems rely on EEG analysis to identify the user's mental state, changes in cognitive state, and responses to events. Motor Execution (ME) is a very important control paradigm. This paper introduces a robust and useful User-Independent Hybrid Brain-Computer Interface (UIHBCI) model to classify signals from fourteen EEG channels used to record the reactions of the brain neurons of nine subjects. Through this study, the researchers identified relevant multisensory features of multi-channel EEG that represent specific mental processes, based on two different evaluation models: (Audio/Video) and (Male/Female). The Deep Belief Network (DBN) was applied independently to the two models, and the overall classification rates achieved in ME classification were better than the state of the art. For evaluation, four models were tested in addition to the proposed model: Linear Discriminant Analysis (LDA), Support Vector Machine (SVM), Brain-Computer Interface Lower-Limb Motor Recovery (BCI LLMR), and Hybrid Steady-State Visual Evoked Potential Rapid Serial Visual Presentation Brain-Computer Interface (Hybrid SSVEP-RSVP BCI). Results indicate that the accuracies of the proposed model, LDA, SVM, BCI LLMR, and Hybrid SSVEP-RSVP BCI for the (A/V) model are 94.44%, 66.67%, 61.11%, 83.33%, and 89.67%, respectively, while for the (M/F) model the overall accuracies are 94.44%, 88.89%, 83.33%, 85.44%, and 89.45%. Finally, the proposed model outperforms the state-of-the-art algorithms on both the (A/V) and (M/F) models.
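The abstract does not detail the DBN architecture, preprocessing, or feature extraction used by the authors. As a rough illustration only, the sketch below approximates a DBN-style binary classifier as stacked Bernoulli RBMs feeding a logistic-regression output layer over flattened 14-channel EEG feature vectors; the layer sizes, epoch length, and randomly generated placeholder data are assumptions for demonstration, not the paper's actual setup.

```python
# Minimal sketch (not the authors' implementation): a DBN approximated as
# stacked Bernoulli RBMs with a logistic-regression output layer, applied to
# flattened 14-channel EEG epochs for a binary task such as Audio vs. Video.
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import train_test_split

# Assumed epoching: 180 epochs, 14 channels, 128 samples per channel (illustrative).
n_epochs, n_channels, n_samples = 180, 14, 128
X = np.random.rand(n_epochs, n_channels * n_samples)  # placeholder for real EEG features
y = np.random.randint(0, 2, n_epochs)                 # binary labels, e.g., Audio=0 / Video=1

dbn = Pipeline([
    ("scale", MinMaxScaler()),   # RBMs expect inputs in [0, 1]
    ("rbm1", BernoulliRBM(n_components=256, learning_rate=0.05, n_iter=20, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=64, learning_rate=0.05, n_iter=20, random_state=0)),
    ("clf", LogisticRegression(max_iter=1000)),
])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
dbn.fit(X_train, y_train)
print("held-out accuracy:", dbn.score(X_test, y_test))
```

With real data, the placeholder arrays would be replaced by band-power or time-domain features extracted from the recorded EEG epochs, and the two RBM layer widths would be tuned per task.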

Keywords