IEEE Access (Jan 2024)

MASTF-net: An EEG Emotion Recognition Network Based on Multi-Source Domain Adaptive Method Based on Spatio-Temporal Image and Frequency Domain Information

  • Hongxiang Xu,
  • Ziyi Pei,
  • Qi Han,
  • Mingyang Hou,
  • Xin Qian,
  • Tengfei Weng,
  • Yuan Tian,
  • Zicheng Qiu,
  • Baobing Zhou

DOI
https://doi.org/10.1109/ACCESS.2024.3349552
Journal volume & issue
Vol. 12
pp. 8485–8501

Abstract

In the field of neuroscience, the electroencephalogram (EEG) is a crucial indicator of emotion. EEG offers good objectivity and high temporal resolution, which makes EEG emotion recognition based on domain adaptation (DA) a preferred method for studying the brain's response to emotional stimuli. However, because EEG emotion features are highly unstable across individuals, a model that merges all source domains into a single source has difficulty predicting emotions from the EEG signals of unseen subjects. To address cross-subject emotion analysis, we propose MASTF-net, an EEG emotion recognition network with a cross-subject multi-source adaptive method in which the EEG features of different subjects are treated as different domains. By analyzing what is invariant in the target domain and what is unique to each source domain, the method performs emotion analysis for different subjects based on spatio-temporal images and frequency-domain information. First, features of the EEG images are extracted along the frequency and time dimensions. Second, combined with the serialized EEG frequency characteristics of local brain regions, independent classification modules are established for the different domains to recognize the emotion feature distribution of each subject. In addition, a feature extraction method for differential entropy (DE) data of EEG is proposed based on frequency-band division, which provides a stable feature input for our network structure. Finally, experiments are conducted on the SEED dataset. The results show that our method achieves better classification accuracy on the cross-subject problem and is superior to other relevant multi-source-domain methods and models. For cross-subject emotion analysis, the highest accuracy of our method reaches 88.19%.
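The abstract describes DE feature extraction based on frequency-band division as the input to the network. The sketch below shows how band-wise DE features are commonly computed from a raw EEG segment; the band edges, the 200 Hz sampling rate, the 62-channel example, and the function names (`bandpass`, `de_features`) are illustrative assumptions, not the authors' exact pipeline.

```python
# Minimal sketch: band-wise differential entropy (DE) features from one EEG segment.
# Band definitions and sampling rate are assumptions for illustration only.
import numpy as np
from scipy.signal import butter, filtfilt

BANDS = {            # typical EEG frequency bands in Hz (assumed, not from the paper)
    "delta": (1, 4),
    "theta": (4, 8),
    "alpha": (8, 14),
    "beta": (14, 31),
    "gamma": (31, 50),
}

def bandpass(signal, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter of a single EEG channel."""
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, signal)

def differential_entropy(segment):
    """DE of an approximately Gaussian signal: 0.5 * log(2 * pi * e * variance)."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(segment))

def de_features(eeg, fs=200):
    """eeg: (n_channels, n_samples) array -> (n_channels, n_bands) DE matrix."""
    feats = np.empty((eeg.shape[0], len(BANDS)))
    for ch in range(eeg.shape[0]):
        for j, (low, high) in enumerate(BANDS.values()):
            feats[ch, j] = differential_entropy(bandpass(eeg[ch], low, high, fs))
    return feats

if __name__ == "__main__":
    # Example: a synthetic 62-channel, 1-second segment at an assumed 200 Hz rate.
    rng = np.random.default_rng(0)
    segment = rng.standard_normal((62, 200))
    print(de_features(segment).shape)  # (62, 5): one DE value per channel per band
```

Per-channel, per-band DE matrices of this kind can then be arranged as spatio-temporal images or serialized by brain region before being fed to the per-domain classification modules described above.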

Keywords