Proceedings of the XXth Conference of Open Innovations Association FRUCT (Nov 2024)

Eye Movement Assessment Methodology Based on Wearable EEG Headband Data Analysis

  • Vladimir Romaniuk,
  • Alexey Kashevnik

DOI
https://doi.org/10.23919/FRUCT64283.2024.10749882
Journal volume & issue
Vol. 36, no. 1
pp. 675–680

Abstract


This study investigates the classification of eye movements using EEG data recorded from a wearable device, with eye-tracking data employed as ground truth for model training. We aim to classify various eye movements, including fixations, saccades, and directional movements, using long short-term memory (LSTM) neural networks. Data were collected from 22 participants using the BrainBit headband, which recorded EEG signals at 250 Hz with four dry electrodes, and the Pupil Labs Invisible eye tracker, which recorded 2D gaze coordinates at 100 Hz during computer-based tasks. The EEG data underwent preprocessing and feature extraction to capture characteristics essential for eye movement classification. Our LSTM model, trained and validated on this dataset, achieved a classification accuracy of 90% for saccade detection and 68% and 62% for up-versus-down and left-versus-right movement classification, respectively. These results demonstrate the potential of using EEG data alone for reliable eye movement classification, laying the groundwork for future research in neural signal processing and its applications in human-computer interaction and neurotechnological systems.
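
To make the described pipeline concrete, the sketch below shows a minimal LSTM classifier for windowed four-channel EEG, in the spirit of the approach summarized in the abstract. It is not the authors' exact architecture: the window length (1 s at 250 Hz), hidden size, layer count, and binary label scheme (e.g., saccade vs. fixation) are illustrative assumptions.

```python
# Minimal sketch (assumed hyperparameters, not the paper's exact model):
# an LSTM that maps a window of 4-channel EEG to an eye-movement class.
import torch
import torch.nn as nn

class EEGEyeMovementLSTM(nn.Module):
    def __init__(self, n_channels=4, hidden_size=64, n_classes=2):
        super().__init__()
        # Two stacked LSTM layers over the raw (or feature-extracted) EEG sequence.
        self.lstm = nn.LSTM(input_size=n_channels, hidden_size=hidden_size,
                            num_layers=2, batch_first=True, dropout=0.2)
        self.classifier = nn.Linear(hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, time, channels); e.g., 1-second windows at 250 Hz -> time = 250.
        _, (h_n, _) = self.lstm(x)
        # Use the last layer's final hidden state as the window representation.
        return self.classifier(h_n[-1])  # logits, e.g., saccade vs. fixation

# Example: a batch of 8 one-second windows from the four-electrode headband.
model = EEGEyeMovementLSTM()
windows = torch.randn(8, 250, 4)
logits = model(windows)
print(logits.shape)  # torch.Size([8, 2])
```

In practice, the eye-tracker labels (resampled from 100 Hz to align with the 250 Hz EEG windows) would supply the targets for standard cross-entropy training of such a model.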

Keywords