Sensors (Sep 2024)

Automatic Recognition of Multiple Emotional Classes from EEG Signals through the Use of Graph Theory and Convolutional Neural Networks

  • Fatemeh Mohajelin,
  • Sobhan Sheykhivand,
  • Abbas Shabani,
  • Morad Danishvar,
  • Sebelan Danishvar,
  • Lida Zare Lahijan

DOI
https://doi.org/10.3390/s24185883
Journal volume & issue
Vol. 24, no. 18
p. 5883

Abstract


Emotion is a complex state arising from the functioning of the human brain in response to various events, and it has no single scientific definition. Emotion recognition has traditionally been carried out by psychologists and experts on the basis of facial expressions, an approach that is limited and prone to error. This study presents a new automatic method for emotion recognition from electroencephalogram (EEG) signals that combines graph theory with convolutional neural networks. In the proposed model, a comprehensive database based on musical stimuli is first collected to induce two- and three-class emotional states covering positive, negative, and neutral emotions. Generative adversarial networks (GANs) are used to augment the recorded data, which are then fed into the proposed deep network for feature extraction and classification. The proposed deep network, which contains four GConv layers, extracts the dynamic information in the EEG data in an optimal manner. The classification accuracy of the proposed approach is 99% for the two-class problem and 98% for the three-class problem. The proposed model has been compared with recent studies and algorithms and yields promising results. The proposed method can help complete the brain-computer interface (BCI) systems puzzle.
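To make the described architecture concrete, the sketch below shows a minimal PyTorch model with four stacked graph-convolution (GConv) layers over EEG electrodes, followed by a dense classifier for two or three emotion classes. The electrode count, feature dimension, layer widths, and the identity-based adjacency are illustrative assumptions; the paper's actual graph construction, GAN augmentation, and training details are not reproduced here.

```python
import torch
import torch.nn as nn


class GConvLayer(nn.Module):
    """One graph-convolution layer: H' = ReLU(A_hat @ H @ W)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, a_hat):
        # x: (batch, nodes, in_dim); a_hat: (nodes, nodes) normalized adjacency
        return torch.relu(a_hat @ self.lin(x))


class EEGGraphClassifier(nn.Module):
    """Four stacked GConv layers followed by a dense classification head."""

    def __init__(self, n_channels=32, feat_dim=128, hidden=64, n_classes=3):
        super().__init__()
        dims = [feat_dim, hidden, hidden, hidden, hidden]
        self.gconvs = nn.ModuleList(
            GConvLayer(dims[i], dims[i + 1]) for i in range(4)
        )
        self.head = nn.Linear(n_channels * hidden, n_classes)

    def forward(self, x, a_hat):
        for g in self.gconvs:
            x = g(x, a_hat)
        return self.head(x.flatten(1))  # logits over emotion classes


# Example: 32 EEG electrodes as graph nodes, 128 features per node.
# A self-loop (identity) adjacency is used as a stand-in for the
# functional-connectivity graph derived via graph theory in the paper.
n_channels = 32
a = torch.eye(n_channels)                       # placeholder adjacency
deg = a.sum(1).clamp(min=1)
a_hat = a / deg.sqrt().outer(deg.sqrt())        # D^{-1/2} A D^{-1/2}
model = EEGGraphClassifier(n_channels=n_channels, feat_dim=128, n_classes=3)
logits = model(torch.randn(8, n_channels, 128), a_hat)  # shape (8, 3)
```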

Keywords