Heliyon (Oct 2024)
Virtual reality-enabled high-performance emotion estimation with the most significant channel pairs
Abstract
Human-computer interface (HCI) systems and electroencephalogram (EEG) signals are widely used in user experience (UX) design to provide immersive interactions with the user. In the context of UX, EEG signals can be used within a metaverse system to assess user engagement, attention, emotional responses, or mental workload. By analyzing EEG signals, system designers can tailor the virtual environment, content, or interactions in real time to optimize UX, improve immersion, and personalize interactions. However, beyond signal-processing cost and classification accuracy, cybersickness in Virtual Reality (VR) systems must also be addressed. Here, channel selection methods can improve HCI and UX applications by discarding the noisy and redundant information carried by largely unrelated EEG channels. For this purpose, a new method for EEG channel selection based on phase-locking value (PLV) analysis is proposed. We hypothesized that EEG channels exhibit consistent PLV interactions across repeated tasks in different trials of the emotion estimation experiment. Subsequently, frequency-based features were extracted and classified with a Multiple-Instance Learning (MIL) variant that groups the features into bags. This study achieves higher classification performance using fewer EEG channels for emotion prediction. Binary classification with the Random Forest (RF) algorithm reached a promising accuracy level of 99%. The proposed method achieved an accuracy of 99.38% for valence using all channels on the new dataset (VREMO) and 98.13% with channel selection. On the benchmark dataset (DEAP), it achieved accuracies of 98.16% using all channels and 98.13% with selected channels.
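To make the PLV-based channel-pair analysis concrete, the sketch below computes the phase-locking value between two signals from the standard definition: the magnitude of the mean phase-difference vector, where instantaneous phases come from the Hilbert analytic signal. This is an illustrative implementation only, not the authors' code; the function name, test signals, and sampling parameters are assumptions for the example.

```python
import numpy as np
from scipy.signal import hilbert


def phase_locking_value(x, y):
    """PLV between two 1-D signals: |mean(exp(i(phi_x - phi_y)))|, in [0, 1]."""
    # Instantaneous phase via the analytic signal (Hilbert transform)
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))


# Synthetic example: two phase-coupled 10 Hz signals vs. an unrelated noise channel
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 256, endpoint=False)          # 1 s at a hypothetical 256 Hz
a = np.sin(2 * np.pi * 10 * t)
b = np.sin(2 * np.pi * 10 * t + 0.5) + 0.1 * rng.standard_normal(t.size)
c = rng.standard_normal(t.size)

print(phase_locking_value(a, b))  # close to 1: consistent phase lag
print(phase_locking_value(a, c))  # much lower: no stable phase relation
```

In a channel-selection setting of the kind described above, such pairwise PLV scores would be computed for all channel pairs across trials, and the most strongly phase-locked pairs retained.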