IEEE Access (Jan 2022)

Deep Learning-Based Approach for Continuous Affect Prediction From Facial Expression Images in Valence-Arousal Space

  • Stephen Khor Wen Hwooi,
  • Alice Othmani,
  • Aznul Qalid Md. Sabri

DOI
https://doi.org/10.1109/ACCESS.2022.3205018
Journal volume & issue
Vol. 10
pp. 96053–96065

Abstract

Facial emotion recognition has attracted extensive attention from the affective computing community, and several approaches have been proposed, most of which classify facial expression images using a set of discrete emotional labels. The quantification of human emotions from faces has also been studied in a continuous 2D emotional space of valence and arousal, which describe the level of pleasantness and the intensity of excitement, respectively. Emotion assessment via valence-arousal computation is a challenging topic with several possible applications, including health monitoring, e-learning, mental health diagnosis, and monitoring of customer interest. Supervised learning of emotional valence-arousal for continuous affect prediction requires labeled data; however, annotating facial images with valence and arousal values requires trained human experts. In this paper, we propose a new and robust deep learning-based approach for continuous affect recognition and prediction. The novelty of our approach is that it maps discrete emotion labels and a learned facial expression representation to the continuous valence-arousal dimensional space. Given a discrete emotion class and a facial image, our deep learning-based approach predicts the valence and arousal values accurately. It outperforms existing approaches for arousal and valence prediction on the AffectNet dataset and shows impressive generalization ability on an unseen dataset for valence prediction.
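To make the central idea concrete, the following is a minimal PyTorch sketch of the kind of mapping the abstract describes: a learned facial-expression embedding is fused with a discrete emotion label and regressed to continuous (valence, arousal) values in [-1, 1]. All layer sizes, the embedding dimension, the concatenation-based fusion, and the class names are illustrative assumptions, not the paper's actual architecture; only the eight discrete AffectNet expression categories and the [-1, 1] annotation range are standard.

```python
import torch
import torch.nn as nn

NUM_CLASSES = 8   # AffectNet's discrete expression categories
EMBED_DIM = 512   # assumed size of the learned facial representation

class ValenceArousalRegressor(nn.Module):
    """Hypothetical head mapping (facial embedding, discrete label)
    to continuous valence-arousal values."""

    def __init__(self, embed_dim=EMBED_DIM, num_classes=NUM_CLASSES):
        super().__init__()
        # Small MLP over the concatenated [embedding | one-hot label] vector.
        self.head = nn.Sequential(
            nn.Linear(embed_dim + num_classes, 256),
            nn.ReLU(),
            nn.Linear(256, 2),  # two outputs: valence and arousal
            nn.Tanh(),          # both dimensions are annotated in [-1, 1]
        )

    def forward(self, face_embedding, emotion_label):
        # emotion_label: integer class index per image, converted to one-hot.
        one_hot = nn.functional.one_hot(emotion_label, NUM_CLASSES).float()
        return self.head(torch.cat([face_embedding, one_hot], dim=-1))

# Usage: a batch of 4 facial embeddings with their discrete emotion classes.
model = ValenceArousalRegressor()
va = model(torch.randn(4, EMBED_DIM), torch.tensor([0, 3, 5, 7]))
print(va.shape)  # torch.Size([4, 2]) -> (valence, arousal) per image
```

In this reading, the discrete label acts as a conditioning signal alongside the image representation, which is one plausible way to exploit cheap categorical annotations when dense valence-arousal labels are scarce.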

Keywords