IEEE Access (Jan 2024)

Emotion Recognition in EEG Based on Multilevel Multidomain Feature Fusion

  • Zhao Long Li,
  • Hui Cao,
  • Ji Sai Zhang

DOI: https://doi.org/10.1109/ACCESS.2024.3417525
Journal volume & issue: Vol. 12, pp. 87237–87247

Abstract

In emotion recognition tasks, electroencephalography (EEG) has become a favored biological signal among researchers, in part because of its high temporal resolution. However, existing studies that combine spatiotemporal and frequency features often fail to exploit this temporal resolution fully and lack mechanisms for effective feature fusion. This paper therefore proposes MMF-Net, a multilevel multidomain feature fusion network, which aims to learn a more comprehensive spatiotemporal-frequency representation and thereby improve emotion classification accuracy. The model takes two-dimensional feature maps derived from the raw EEG as input and extracts spatiotemporal and spatial-frequency features in parallel at multiple levels, making full use of the signal's temporal resolution. At each level, a purpose-built fusion layer combines the captured temporal, spatial, and frequency features; this layer also speeds model convergence and strengthens the learned feature detectors. In subject-dependent experiments on the DEAP dataset, MMF-Net achieved average accuracies of 99.50% and 99.59% on the valence and arousal dimensions, respectively; in subject-independent experiments, the corresponding averages were 97.46% and 97.54%.
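
The abstract gives only a high-level picture of the architecture, with no implementation details. Below is a minimal PyTorch sketch of the idea it describes: at each level, two parallel convolutional branches (one with a kernel elongated along the time axis, one along the frequency axis) feed a small fusion layer, whose output becomes the next level's input. The names (MMFNetSketch, FusionLayer), kernel sizes, channel widths, fusion scheme, and input shape are all illustrative assumptions, not the authors' design.

import torch
import torch.nn as nn

class FusionLayer(nn.Module):
    """Hypothetical fusion layer: concatenates the temporal- and
    frequency-branch feature maps and mixes them with a 1x1 convolution."""
    def __init__(self, channels: int):
        super().__init__()
        self.mix = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, temporal: torch.Tensor, spectral: torch.Tensor) -> torch.Tensor:
        return self.mix(torch.cat([temporal, spectral], dim=1))

class MMFNetSketch(nn.Module):
    """Illustrative two-level network: each level extracts spatiotemporal
    and spatial-frequency features in parallel, then fuses them."""
    def __init__(self, in_channels: int = 1, num_classes: int = 2):
        super().__init__()
        widths = [32, 64]  # assumed channel widths per level
        self.temporal_branches = nn.ModuleList()
        self.spectral_branches = nn.ModuleList()
        self.fusions = nn.ModuleList()
        c_in = in_channels
        for c_out in widths:
            # Tall kernel along the time axis to capture temporal patterns.
            self.temporal_branches.append(
                nn.Conv2d(c_in, c_out, kernel_size=(7, 3), padding=(3, 1)))
            # Wide kernel along the frequency axis to capture spectral patterns.
            self.spectral_branches.append(
                nn.Conv2d(c_in, c_out, kernel_size=(3, 7), padding=(1, 3)))
            self.fusions.append(FusionLayer(c_out))
            c_in = c_out
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(c_in, num_classes))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for t_branch, s_branch, fuse in zip(
                self.temporal_branches, self.spectral_branches, self.fusions):
            x = fuse(t_branch(x), s_branch(x))
        return self.head(x)

if __name__ == "__main__":
    # One 2-D EEG feature map per sample (time x frequency); shape is illustrative.
    batch = torch.randn(4, 1, 128, 64)
    logits = MMFNetSketch()(batch)
    print(logits.shape)  # torch.Size([4, 2]) for binary valence or arousal

The concatenate-then-1x1-convolution fusion shown here is one common way to merge parallel feature maps; the paper's actual fusion network layer may differ in structure.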

Keywords