IEEE Access (Jan 2024)

Advancing Emotional Health Assessments: A Hybrid Deep Learning Approach Using Physiological Signals for Robust Emotion Recognition

  • Amna Waheed Awan,
  • Imran Taj,
  • Shehzad Khalid,
  • Syed Muhammad Usman,
  • Ali Shariq Imran,
  • Muhammad Usman Akram

DOI
https://doi.org/10.1109/ACCESS.2024.3463746
Journal volume & issue
Vol. 12
pp. 141890 – 141904

Abstract

Emotional health significantly impacts physical and psychological well-being, with emotional imbalances and cognitive disorders leading to various health issues. Timely diagnosis of mental illnesses is crucial for preventing severe disorders and enhancing the quality of medical care. Physiological signals such as Electrocardiograms (ECG) and Electroencephalograms (EEG), which reflect cardiac and neuronal activity, are reliable for emotion recognition because they are less susceptible to manipulation than physical signals; Galvanic Skin Response (GSR) is likewise closely linked to emotional states. Researchers have developed various signal-classification methods for detecting emotions. However, these signals are susceptible to noise and are inherently non-stationary, changing constantly over time, while emotional states themselves can vary rapidly. Traditional techniques for analyzing physiological signals may therefore be inadequate for studying these dynamic changes. This research introduces a deep learning approach that combines advanced signal processing and machine learning to analyze physiological signals for emotion recognition. We propose a CNN-Vision Transformer (CVT) based method with ensemble classification. The process involves decomposing signals into segments, removing noise, and extracting features using a 1D CNN and Vision Transformers. These features are integrated into a single vector for classification by an ensemble of LSTM, ELM, and SVM classifiers, whose outputs are then synthesized using Model-Agnostic Meta-Learning (MAML) to improve prediction accuracy. Validated on the AMIGOS and DEAP datasets with 10-fold cross-validation, our method achieved accuracies of up to 98.2%, sensitivity of 99.15%, and specificity of 99.53%, outperforming existing emotion-charting techniques.
This novel method yields significant improvements of 3 to 4% in the accuracy, sensitivity, and specificity of emotion detection, leveraging physiological signals for comprehensive emotional assessments.
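The pipeline the abstract describes — segment the signal, extract two complementary feature sets, fuse them into one vector, classify with an ensemble, and combine the classifier outputs — can be sketched at a high level as below. Everything here is a hypothetical stand-in: the two feature extractors (windowed statistics in place of the 1D CNN and Vision Transformer), the three toy classifiers (in place of LSTM, ELM, and SVM), and the fixed-weight score combination (the paper instead learns this step with MAML) are placeholders for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def cnn_features(segment, dim=8):
    # Stand-in for the 1D-CNN branch: per-window means of the segment.
    return np.array([w.mean() for w in np.array_split(segment, dim)])

def vit_features(segment, dim=8):
    # Stand-in for the Vision-Transformer branch: per-window std deviations.
    return np.array([w.std() for w in np.array_split(segment, dim)])

def fuse(segment):
    # Concatenate both branches into a single feature vector, as the
    # abstract describes.
    return np.concatenate([cnn_features(segment), vit_features(segment)])

# Three placeholder classifiers standing in for LSTM, ELM, and SVM; each
# maps a feature vector to a score per emotion class (2 classes here).
def clf_a(x): return np.array([x.sum(), -x.sum()])
def clf_b(x): return np.array([x[0], x[1]])
def clf_c(x): return np.array([x.mean(), -x.mean()])

def ensemble_predict(segment, weights=(1.0, 1.0, 1.0)):
    # Weighted sum of classifier scores; the paper learns this combination
    # with MAML, which a fixed weighting only crudely approximates.
    x = fuse(segment)
    scores = sum(w * f(x) for w, f in zip(weights, (clf_a, clf_b, clf_c)))
    return int(np.argmax(scores))

segment = rng.standard_normal(256)  # one denoised physiological-signal segment
label = ensemble_predict(segment)
```

The key design point carried over from the abstract is late fusion at two levels: features are merged before classification, and classifier decisions are merged again afterwards, so no single branch or classifier dominates the final prediction.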

Keywords