IEEE Transactions on Neural Systems and Rehabilitation Engineering (Jan 2023)

A Cross-Space CNN With Customized Characteristics for Motor Imagery EEG Classification

  • Ying Hu,
  • Yan Liu,
  • Siqi Zhang,
  • Ting Zhang,
  • Bin Dai,
  • Bo Peng,
  • Hongbo Yang,
  • Yakang Dai

DOI
https://doi.org/10.1109/TNSRE.2023.3249831
Journal volume & issue
Vol. 31
pp. 1554 – 1565

Abstract

Classification of motor imagery electroencephalogram (MI-EEG) signals in brain-computer interfaces (BCIs) can be used to decode neurological activity and has been widely applied to the control of external devices. However, two factors still hinder improvements in classification accuracy and robustness, especially in multi-class tasks. First, existing algorithms operate in a single space (either the measuring space or the source space): they are limited either by the holistically low spatial resolution of the measuring space or by the merely local high-resolution information accessible from the source space, and thus fail to provide representations that are both holistic and of high resolution. Second, subject specificity is not sufficiently characterized, so personalized intrinsic information is lost. We therefore propose a cross-space convolutional neural network (CS-CNN) with customized characteristics for four-class MI-EEG classification. The algorithm uses modified customized band common spatial patterns (CBCSP) and duplex mean-shift clustering (DMSClustering) to express subject-specific rhythms and source distribution information across the two spaces. Multi-view features from the time, frequency, and space domains are then extracted and fed to a CNN, which fuses the characteristics of both spaces and classifies them. MI-EEG data were collected from 20 subjects. On this private dataset, the proposed method achieves a classification accuracy of 96.05% with real MRI information and 94.79% without MRI. On the BCI Competition IV-2a dataset, CS-CNN outperforms state-of-the-art algorithms, improving accuracy by 1.98% and reducing the standard deviation by 5.15%.
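The customized band CSP step described in the abstract builds on standard common spatial patterns. As a rough illustration of that underlying technique only (not the authors' CBCSP implementation), a minimal two-class CSP feature extractor can be sketched in Python with NumPy; the function names and the synthetic trial data below are hypothetical:

```python
import numpy as np

def csp_filters(X1, X2, n_components=4):
    """Compute CSP spatial filters from two classes of EEG trials.

    X1, X2: arrays of shape (trials, channels, samples).
    Returns W of shape (n_components, channels).
    """
    def avg_cov(X):
        # Average per-trial channel covariance matrices
        return np.mean([np.cov(trial) for trial in X], axis=0)

    C1, C2 = avg_cov(X1), avg_cov(X2)
    # Generalized eigenvalue problem: C1 w = lambda * (C1 + C2) w
    evals, evecs = np.linalg.eig(np.linalg.solve(C1 + C2, C1))
    order = np.argsort(evals)[::-1]
    evecs = evecs[:, order].real
    # Keep filters from both ends of the spectrum (most discriminative)
    half = n_components // 2
    picks = np.r_[:half, -half:0]
    return evecs[:, picks].T

def csp_features(W, X):
    """Normalized log-variance features of spatially filtered trials."""
    Z = np.einsum('fc,tcs->tfs', W, X)          # apply filters per trial
    var = Z.var(axis=2)
    return np.log(var / var.sum(axis=1, keepdims=True))

# Synthetic 8-channel trials standing in for two MI classes
rng = np.random.default_rng(0)
X1 = rng.standard_normal((30, 8, 256))
X2 = rng.standard_normal((30, 8, 256)) * 1.5
W = csp_filters(X1, X2)
F = csp_features(W, np.concatenate([X1, X2]))
print(F.shape)  # (60, 4)
```

In CBCSP, such filters would be learned per subject-specific frequency band rather than broadband; multi-class settings typically apply this two-class procedure in a one-vs-rest scheme.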

Keywords