IEEE Access (Jan 2024)

Cross-Subject EEG-Based Emotion Recognition Using Deep Metric Learning and Adversarial Training

  • Hawraa Razzaq Abed Alameer,
  • Pedram Salehpour,
  • Seyyed Hadi Aghdasi,
  • Mohammad-Reza Feizi-Derakhshi

DOI
https://doi.org/10.1109/ACCESS.2024.3458833
Journal volume & issue
Vol. 12
pp. 130241 – 130252

Abstract


Due to individual differences and the non-stationary nature of EEG signals, developing an accurate cross-subject EEG emotion recognition method remains in demand. Despite many successful attempts, models generalized across subjects are less accurate than those trained for a specific individual. Moreover, most cross-subject training methods assume that unlabeled data from the target subjects is available, an assumption that rarely holds in practice. To address these issues, this paper presents a novel deep similarity learning loss tailored to the emotion recognition task. This loss function minimizes intra-emotion-class variation among EEG segments with different subject labels while maximizing inter-emotion-class variation. Another key property of the proposed semantic embedding loss is that it preserves the order of emotion classes in the learned embedding; that is, the embedding space maintains the semantic ordering of emotions. We also integrate the deep similarity learning module with adversarial learning, which helps learn a subject-invariant representation of EEG signals in an end-to-end training paradigm. We conduct several experiments on three widely used datasets: SEED, SEED-GER, and DEAP. The results confirm that the proposed method effectively learns a subject-invariant representation from EEG signals and consistently outperforms state-of-the-art (SOTA) peer methods.
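
The abstract names three ingredients: a similarity loss that pulls same-emotion EEG embeddings together across subjects and pushes different emotions apart, an ordering constraint so that distances respect the semantic order of emotion classes, and an adversarial subject classifier that encourages subject-invariant features. The sketch below is a minimal, hedged illustration of how such a combination could look in PyTorch; it is not the authors' implementation, and all layer sizes, margins, loss weights, and helper names (GradReverse, emotion_similarity_loss) are illustrative assumptions.

```python
# Minimal sketch (not the paper's exact formulation) combining:
# (1) an intra/inter-emotion-class similarity loss across subjects,
# (2) an ordinal margin so embedding distances follow negative < neutral < positive,
# (3) adversarial subject classification via a gradient-reversal layer.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; sign-flipped gradients in the backward pass."""
    @staticmethod
    def forward(ctx, x, lamb):
        ctx.lamb = lamb
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lamb * grad_output, None


def emotion_similarity_loss(z, emo_labels, margin=1.0):
    """Pull same-emotion pairs together, push different emotions apart.

    The push margin grows with the gap between class indices, one assumed way
    of realising the 'semantic order' property described in the abstract."""
    dist = torch.cdist(z, z)                                    # pairwise distances
    same = emo_labels.unsqueeze(0) == emo_labels.unsqueeze(1)
    gap = (emo_labels.unsqueeze(0) - emo_labels.unsqueeze(1)).abs().float()
    eye = torch.eye(len(z), dtype=torch.bool, device=z.device)
    pull = dist[same & ~eye].pow(2).mean()                      # intra-class compactness
    push = F.relu(margin * gap[~same] - dist[~same]).pow(2).mean()  # ordered separation
    return pull + push


class Model(nn.Module):
    def __init__(self, in_dim=310, emb_dim=64, n_emotions=3, n_subjects=15):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                                     nn.Linear(128, emb_dim))
        self.emotion_head = nn.Linear(emb_dim, n_emotions)
        self.subject_head = nn.Linear(emb_dim, n_subjects)      # adversarial branch

    def forward(self, x, lamb=1.0):
        z = self.encoder(x)
        return z, self.emotion_head(z), self.subject_head(GradReverse.apply(z, lamb))


# One illustrative training step on random stand-in data.
model = Model()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(32, 310)              # e.g. flattened per-segment EEG features
emo = torch.randint(0, 3, (32,))      # 0 = negative, 1 = neutral, 2 = positive
subj = torch.randint(0, 15, (32,))    # subject IDs from the source domains

z, emo_logits, subj_logits = model(x)
loss = (F.cross_entropy(emo_logits, emo)
        + emotion_similarity_loss(z, emo)
        + F.cross_entropy(subj_logits, subj))  # reversed gradients make the encoder fool this term
opt.zero_grad(); loss.backward(); opt.step()
```

Because the subject head sits behind the gradient-reversal layer, minimizing its cross-entropy trains the classifier while simultaneously pushing the encoder toward features the classifier cannot use, i.e. a subject-invariant representation, which is the end-to-end behaviour the abstract describes.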

Keywords