Applied Sciences (Jan 2022)

Comparative Analysis of Emotion Classification Based on Facial Expression and Physiological Signals Using Deep Learning

  • SeungJun Oh,
  • Dong-Keun Kim

DOI: https://doi.org/10.3390/app12031286
Journal volume & issue: Vol. 12, No. 3, p. 1286

Abstract


This study aimed to classify emotions based on facial expressions and physiological signals using deep learning and to compare the results of the two approaches. We asked 53 subjects to make facial expressions conveying four types of emotion. Each subject then watched an emotion-inducing video for 1 min while physiological signals were recorded. The four emotions were grouped into positive and negative emotions, and three types of deep-learning models were designed to classify them: one taking facial expressions as input, one taking physiological signals as input, and one taking both types of input simultaneously. Classification accuracy was 81.54% with physiological signals, 99.9% with facial expressions, and 86.2% with both inputs. The model built on facial expressions alone showed the best performance, confirming that, in terms of accuracy, using only facial expressions is preferable to combining multiple inputs. However, this conclusion is based on accuracy alone and does not consider computational cost; physiological signals or multiple inputs may still be preferable depending on the situation and research purpose.
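The abstract does not give the architectures of the three models. The sketch below is only a minimal PyTorch illustration of the multi-input variant, in which facial expressions and physiological signals are fed to one network through separate branches and fused before a four-class emotion classifier. The class name MultiInputEmotionNet, all layer sizes, the number of signal channels, and the input dimensions are assumptions for illustration, not the authors' design.

```python
import torch
import torch.nn as nn

class MultiInputEmotionNet(nn.Module):
    """Illustrative two-branch network: a 2-D CNN branch for facial-expression
    images and a 1-D CNN branch for physiological signals, fused before a
    4-class emotion classifier. All layer sizes are arbitrary assumptions."""

    def __init__(self, signal_channels: int = 4, num_classes: int = 4):
        super().__init__()
        # Branch 1: facial-expression images (3 x H x W)
        self.face_branch = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),          # -> 32 features
        )
        # Branch 2: physiological signals (channels x time samples)
        self.signal_branch = nn.Sequential(
            nn.Conv1d(signal_channels, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),          # -> 16 features
        )
        # Fusion head: concatenated features -> emotion classes
        self.classifier = nn.Sequential(
            nn.Linear(32 + 16, 64), nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, face: torch.Tensor, signal: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.face_branch(face), self.signal_branch(signal)], dim=1)
        return self.classifier(fused)

# Dummy forward pass: a batch of 8 face images and 8 signal windows.
model = MultiInputEmotionNet()
faces = torch.randn(8, 3, 64, 64)      # hypothetical image size
signals = torch.randn(8, 4, 1500)      # hypothetical channels x samples
logits = model(faces, signals)         # shape: (8, 4)
```

Under the same assumptions, the two single-input models described in the abstract would correspond to using only one of the branches with its own classification head.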

Keywords