Scientific Reports (Dec 2022)
Unsupervised adaptation of an ECoG based brain–computer interface using neural correlates of task performance
Abstract
Brain–computer interfaces (BCIs) translate brain signals into commands for external effectors and mainly target severely disabled users. The usability of BCIs may be improved by relaxing their major constraints, such as the need for dedicated training sessions to initially calibrate, and later update, the neural signal decoders. In this study, we show that it is possible to train and update BCI decoders during free use of motor BCIs. In addition to the neural signal decoder that controls the effectors (control decoder), a second classifier is proposed to detect neural correlates of BCI motor task performance (MTP). The MTP decoder reveals whether the actions performed by the BCI effectors matched the user's intentions. The combined outputs of the MTP and control decoders are used to form training datasets that update the control decoder online and in real time during free use of the BCI. The usability of the proposed auto-adaptive BCI (aaBCI) is demonstrated for two principal BCI paradigms: one with discrete outputs (4-class BCI, virtual four-limb exoskeleton) and one with continuous outputs (2D cursor control). The proof of concept was carried out in an online simulation study using an ECoG dataset recorded from a tetraplegic patient during a BCI clinical trial. For the multiclass BCI, the control decoder reached a multiclass area under the ROC curve of 0.7404 with aaBCI, compared with a chance level of 0.5173 and 0.8187 for supervised training; for the continuous BCI, it reached a cosine similarity of 0.1211 with aaBCI, compared with a chance level of 0.0036 and 0.2002 for supervised training.
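To make the auto-adaptive idea in the abstract concrete, the following is a minimal sketch, in Python, of one trial of the update loop: the control decoder issues a command, the effector executes it, the MTP decoder judges from the neural signal whether the action matched the user's intent, and only trials judged correct are fed back as pseudo-labels to update the control decoder online. All names here (mtp_decoder, effector, the feature dimensionality, and the use of an SGD classifier) are hypothetical placeholders for illustration, not the authors' implementation.

```python
# Sketch of an auto-adaptive BCI (aaBCI) update step, under assumed components.
import numpy as np
from sklearn.linear_model import SGDClassifier

N_CLASSES = 4      # e.g. a 4-class BCI driving a virtual four-limb exoskeleton
N_FEATURES = 64    # assumed ECoG feature dimensionality

# Control decoder trained incrementally; seeded here so predict() is usable
# (in practice it would start from an initial calibration).
control = SGDClassifier(loss="log_loss")
control.partial_fit(np.zeros((1, N_FEATURES)), [0], classes=np.arange(N_CLASSES))

def auto_adaptive_step(x, mtp_decoder, effector):
    """One trial of unsupervised adaptation during free BCI use.

    x            : 1-D array of neural features for the current trial
    mtp_decoder  : callable returning True if the neural correlates indicate
                   the executed action matched the user's intention
    effector     : callable that executes the decoded command
    """
    y_pred = control.predict(x.reshape(1, -1))[0]
    effector(y_pred)                     # drive the BCI effector with the decoded command
    if mtp_decoder(x, y_pred):           # MTP decoder: action agreed with intent
        # Use the decoder's own output as a pseudo-label for an online update.
        control.partial_fit(x.reshape(1, -1), [y_pred])
    return y_pred
```

The same principle extends to the continuous (2D cursor) paradigm by replacing the classifier with an incrementally updated regression model and gating its updates with the MTP decoder in the same way.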