Journal of Integrative Neuroscience (Jan 2024)

Fusion of Multi-domain EEG Signatures Improves Emotion Recognition

  • Xiaomin Wang,
  • Yu Pei,
  • Zhiguo Luo,
  • Shaokai Zhao,
  • Liang Xie,
  • Ye Yan,
  • Erwei Yin,
  • Shuang Liu,
  • Dong Ming

DOI
https://doi.org/10.31083/j.jin2301018
Journal volume & issue
Vol. 23, no. 1
p. 18

Abstract


Background: Affective computing has gained increasing attention in the area of human-computer interfaces, where electroencephalography (EEG)-based emotion recognition occupies an important position. Nevertheless, the diversity of emotions and the complexity of EEG signals mean that the relationships between emotion and the frequency, spatial, and temporal information of multichannel EEG signals remain underexplored. Methods: Audio-video stimuli were used to elicit four types of emotions (sad, fearful, happy, neutral) in 32 male and female subjects (aged 21–42 years) while their EEG signals were recorded. We developed a multidimensional analysis framework that fuses phase-locking value (PLV), microstate, and power spectral density (PSD) features of the EEG to improve emotion recognition. Results: PSDs increased with emotional valence, and connectivity in the prefrontal, temporal, and occipital lobes in high-frequency bands differentiated emotions more clearly. Transition probabilities between microstates appeared to be related to emotional valence. Features fused by Discriminant Correlation Analysis achieved an average cross-subject classification accuracy of 64.69%, an improvement of more than 7% over single-mode and directly concatenated features. Conclusions: Different types of EEG features have complementary properties in emotion recognition, and combining the three types of EEG features in a correlated way improves emotion classification performance.
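The abstract reports fusing PSD, PLV, and microstate features with Discriminant Correlation Analysis (DCA) before classification. The sketch below is not the authors' code; it illustrates a generic two-modality DCA fusion step following the standard DCA formulation, with hypothetical feature matrices standing in for the EEG features described above.

```python
# Minimal sketch of Discriminant Correlation Analysis (DCA) fusion of two
# feature sets. Shapes, feature names, and the toy data are hypothetical.
import numpy as np

def dca_fuse(X, Y, labels, r=None):
    """Fuse two feature matrices X, Y (n_samples x n_features) with DCA."""
    def discriminant_transform(F, labels, r):
        # Build the between-class scatter in its low-rank (c x c) form
        # and keep the r leading discriminant directions.
        classes = np.unique(labels)
        mean_all = F.mean(axis=0)
        Phi = np.column_stack([
            np.sqrt((labels == c).sum()) * (F[labels == c].mean(axis=0) - mean_all)
            for c in classes
        ])                                   # d x c
        w, Q = np.linalg.eigh(Phi.T @ Phi)   # c x c eigenproblem
        order = np.argsort(w)[::-1][:r]
        w, Q = w[order], Q[:, order]
        Wb = (Phi @ Q) / np.sqrt(w)          # d x r, whitens between-class scatter
        return F @ Wb                        # n_samples x r

    r = r or (len(np.unique(labels)) - 1)
    Xp = discriminant_transform(X, labels, r)
    Yp = discriminant_transform(Y, labels, r)

    # Make the two transformed sets maximally correlated (CCA-like step).
    Sxy = Xp.T @ Yp                          # r x r between-set covariance
    U, s, Vt = np.linalg.svd(Sxy)
    Wx = U / np.sqrt(s)
    Wy = Vt.T / np.sqrt(s)
    return np.hstack([Xp @ Wx, Yp @ Wy])     # fused feature vector per trial

# Toy usage with random stand-ins for PSD and PLV features (4 emotion classes).
rng = np.random.default_rng(0)
labels = np.repeat(np.arange(4), 30)
psd_feats = rng.standard_normal((120, 62))   # hypothetical PSD features
plv_feats = rng.standard_normal((120, 100))  # hypothetical PLV features
fused = dca_fuse(psd_feats, plv_feats, labels)
print(fused.shape)  # (120, 6): 2 * (n_classes - 1) fused dimensions
```

In practice the fused features would be passed to a classifier; fusing the third modality (microstate features) could be handled by applying the same step pairwise, though the paper's exact fusion order is not specified here.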
