IEEE Access (Jan 2024)

Learning Multiband-Temporal-Spatial EEG Representations of Emotions Using Lightweight Temporal Convolution and 3D Convolutional Neural Network

  • Fengjie Wu,
  • Jiarui Liu,
  • Jisen Yang,
  • Lihan Zhang,
  • Yan He,
  • Zhaolong Lin

DOI
https://doi.org/10.1109/ACCESS.2024.3460393
Journal volume & issue
Vol. 12
pp. 132016 – 132026

Abstract

Emotion recognition based on electroencephalography (EEG) has become an important topic in affective computing. However, current research does not exploit the relationships between channel frequency bands, and mainstream models from the past two years require large numbers of trainable parameters and FLOPs. To address these problems, this paper proposes a new emotion recognition framework that learns the frequency band relationships of EEG signals and the temporal-spatial representation of the frequency domain. First, we extract differential entropy features at a frequency resolution of 2 Hz to retain rich frequency band information. Second, we design a temporal convolution and 3D convolutional neural network (TC3DNet), which fuses a Temporal Convolutional Network (TCN) and a 3D convolutional neural network (3DCNN) in series: the TCN learns frequency band relationships, and the 3DCNN then learns the temporal-spatial representation. The average recognition accuracy of TC3DNet on the benchmark SEED and SEED-IV EEG emotion recognition datasets is 98.48% and 95.30%, respectively, outperforming current state-of-the-art methods. The impact of different frequency band divisions on the proposed framework is also explored. Furthermore, this study visualizes the extracted features, enhancing the interpretability of our work.
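The differential entropy (DE) feature extraction described above can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes the common simplification that a band-limited EEG segment is approximately Gaussian, for which DE reduces to 0.5·ln(2πeσ²), and it uses simple FFT masking (a hypothetical stand-in for the paper's filtering stage) to split the signal into 2 Hz sub-bands. The band range 4–44 Hz is also an assumption for illustration.

```python
import numpy as np

def differential_entropy(x):
    """DE of an approximately Gaussian signal: 0.5 * ln(2*pi*e*var)."""
    var = np.var(x)
    return 0.5 * np.log(2 * np.pi * np.e * var)

def band_de_features(signal, fs, band_width=2.0, f_min=4.0, f_max=44.0):
    """Compute one DE value per 2 Hz sub-band of a single-channel segment.

    FFT masking is used here as a simple stand-in for band-pass filtering;
    the band edges (4-44 Hz) are illustrative assumptions, not the paper's.
    A band with zero power would yield -inf, so real EEG input is assumed.
    """
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    features = []
    lo = f_min
    while lo < f_max:
        hi = lo + band_width
        mask = (freqs >= lo) & (freqs < hi)
        # Zero out everything outside [lo, hi) and return to the time domain.
        band = np.fft.irfft(spectrum * mask, n=len(signal))
        features.append(differential_entropy(band))
        lo = hi
    return np.array(features)
```

Applied per channel, this yields a (channels × bands) feature map per time window; stacking windows gives the 3D (band–spatial–temporal) tensor that a TCN/3DCNN pipeline such as TC3DNet could then consume.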

Keywords