Human Behavior and Emerging Technologies (Jan 2024)

A Multimodal Low Complexity Neural Network Approach for Emotion Recognition

  • Adrian Rodriguez Aguiñaga,
  • Margarita Ramirez Ramirez,
  • Maria del Consuelo Salgado Soto,
  • Maria de los Angeles Quezada Cisnero

DOI
https://doi.org/10.1155/2024/5581443
Journal volume & issue
Vol. 2024

Abstract

This paper introduces a neural network-based model for classifying emotional states from multimodal physiological signals. The model is trained on data from the AMIGOS and SEED-V databases: AMIGOS combines electroencephalogram (EEG), electrocardiogram (ECG), and galvanic skin response (GSR) recordings of emotional responses, while SEED-V contributes EEG signals. We implemented a sequential neural network architecture with two hidden layers and performed extensive hyperparameter tuning to reach optimal performance. The model's effectiveness was evaluated on binary classification tasks for arousal and valence, as well as a more complex four-class task over the emotion labels happy, sad, neutral, and disgust. Across these scenarios, the model consistently achieved accuracies ranging from 79% to 86% on the AMIGOS database and up to 97% on SEED-V. A notable aspect of our approach is that the model recognizes emotions accurately without extensive signal preprocessing, a common burden in multimodal emotion analysis. This property improves the model's practical applicability in real-world scenarios where rapid and efficient emotion recognition is essential.
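The two-hidden-layer sequential architecture described above can be sketched as a plain feed-forward pass. The layer sizes, the ReLU activations, and the dimensionality of the fused EEG/ECG/GSR feature vector are illustrative assumptions, not values reported in the paper; the random weights merely stand in for trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: fused multimodal feature vector -> two hidden
# layers -> four emotion classes (happy, sad, neutral, disgust).
N_FEATURES, H1, H2, N_CLASSES = 128, 64, 32, 4

# Random weights as placeholders for trained parameters.
W1 = rng.standard_normal((N_FEATURES, H1)); b1 = np.zeros(H1)
W2 = rng.standard_normal((H1, H2));         b2 = np.zeros(H2)
W3 = rng.standard_normal((H2, N_CLASSES)); b3 = np.zeros(N_CLASSES)

def forward(x):
    """Forward pass of a two-hidden-layer sequential network."""
    h1 = np.maximum(0.0, x @ W1 + b1)   # first hidden layer, ReLU
    h2 = np.maximum(0.0, h1 @ W2 + b2)  # second hidden layer, ReLU
    return h2 @ W3 + b3                 # logits for the four classes

x = rng.standard_normal((8, N_FEATURES))  # batch of 8 feature vectors
logits = forward(x)
print(logits.shape)  # (8, 4)
```

For the binary arousal and valence tasks mentioned in the abstract, the same network shape would simply end in a two-unit (or single sigmoid-unit) output layer instead of four logits.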