Frontiers in Public Health (Nov 2024)
Multi-physiological signal fusion for objective emotion recognition in educational human–computer interaction
Abstract
Introduction: An increasing prevalence of psychological stress and emotional issues among higher education teachers necessitates innovative approaches to promote their wellbeing. Emotion recognition technology, integrated into educational human–computer interaction (HCI) systems, offers a promising solution. This study aimed to develop a robust emotion recognition system to enhance teacher–student interactions within educational HCI settings.

Methods: A multi-physiological signal-based emotion recognition system was developed using wearable devices to capture electrocardiography (ECG), electromyography (EMG), electrodermal activity, and respiratory signals. Features were extracted using time-domain and time-frequency domain analysis methods, followed by feature selection to eliminate redundant features. A convolutional neural network (CNN) with attention mechanisms served as the decision-making model.

Results: The proposed system recognized emotional states more accurately than existing methods. The attention mechanisms provided interpretability by highlighting the most informative physiological features for emotion classification.

Discussion: The developed system advances emotion recognition for educational HCI, enabling more accurate and standardized assessment of teacher emotional states. Real-time integration of this technology into educational environments can enhance teacher–student interactions and contribute to improved learning outcomes. Future research can explore the generalizability of this system to diverse populations and educational settings.
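To make the described pipeline concrete, the following is a minimal illustrative sketch, not the authors' implementation: a PyTorch 1D CNN with a soft-attention pooling layer applied to four physiological channels (ECG, EMG, electrodermal activity, respiration). The layer sizes, window length, attention formulation, and number of emotion classes are assumptions chosen for demonstration, and the paper's feature extraction and feature selection steps are not reproduced here.

# Illustrative sketch only (assumed architecture, not the published model):
# a 1D CNN over windowed multi-physiological signals with attention pooling,
# returning class logits plus attention weights for interpretability.
import torch
import torch.nn as nn


class AttentionCNN(nn.Module):
    def __init__(self, n_channels: int = 4, n_classes: int = 4):
        super().__init__()
        # Convolutional feature extractor over raw signal windows
        # (channels: ECG, EMG, electrodermal activity, respiration).
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.MaxPool1d(4),
        )
        # Attention scores each time step of the feature maps; softmax-normalized
        # weights then pool the maps into one weighted summary vector.
        self.attn_score = nn.Conv1d(64, 1, kernel_size=1)
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor):
        # x: (batch, n_channels, time)
        h = self.features(x)                        # (batch, 64, t')
        w = torch.softmax(self.attn_score(h), -1)   # (batch, 1, t') attention weights
        pooled = (h * w).sum(dim=-1)                # (batch, 64) attention-weighted summary
        return self.classifier(pooled), w.squeeze(1)


if __name__ == "__main__":
    model = AttentionCNN()
    # One batch of 8 windows, 4 physiological channels, 1,024 samples each.
    logits, attention = model(torch.randn(8, 4, 1024))
    print(logits.shape, attention.shape)  # torch.Size([8, 4]) torch.Size([8, 64])

The returned attention weights can be inspected per window to see which segments most influenced a prediction, mirroring the interpretability role the abstract attributes to the attention mechanisms.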