IEEE Access (Jan 2024)

Exploring the Impact of Partial Occlusion on Emotion Classification From Facial Expressions: A Comparative Study of XR Headsets and Face Masks

  • Alberto Casas-Ortiz,
  • Jon Echeverria,
  • Nerea Jimenez-Tellez,
  • Olga C. Santos

DOI
https://doi.org/10.1109/ACCESS.2024.3380439
Journal volume & issue
Vol. 12
pp. 44613–44627

Abstract

This study provides a comparative analysis of emotion estimation from facial expressions under partial occlusion caused by face masks and extended reality (XR) headsets. Unlike previous studies that have explored these two scenarios independently, this research compares and analyzes the statistical differences between them. To this end, the RAF-DB dataset has been used as a non-occluded baseline to construct two new datasets: i) a dataset of faces partially occluded by face masks, and ii) a dataset of faces partially occluded by XR headsets. To evaluate the impact of occlusion on emotion estimation, three deep learning models have been fine-tuned using transfer learning, and the results of a random classifier have been used as a baseline. Seven metrics were obtained per dataset, and a two-way ANOVA test was performed on each metric. As expected, statistically significant differences are observed on all metrics between the non-occluded faces (acc. 0.8780) and the faces partially occluded by face masks (acc. 0.7520) or XR headsets (acc. 0.7400). Notably, the comparison between the two partially occluded datasets revealed statistically significant differences in f1-score (macro), precision (macro), and recall (macro), which we attribute to the two types of occlusion covering different parts of the face that are key to certain emotions. This research contributes to advancing emotion recognition systems by highlighting their robustness and effectiveness even under partial occlusion, and by providing a full comparative analysis between two common types of occlusion.
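As a rough illustration of the per-metric comparison described in the abstract, the sketch below runs a two-way ANOVA on a single metric with statsmodels. The choice of factors (occlusion condition and model), the column names, and all numeric values are hypothetical placeholders for illustration only, not the paper's actual data, code, or results.

```python
# Hypothetical sketch of a per-metric two-way ANOVA (occlusion condition x model).
# Factor levels, column names, and numbers are illustrative placeholders,
# not the study's actual data.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# One row per (occlusion condition, model, run) with the metric value for that run.
results = pd.DataFrame({
    "dataset": ["non_occluded", "face_mask", "xr_headset"] * 6,
    "model":   ["model_a"] * 9 + ["model_b"] * 9,
    "accuracy": [
        0.88, 0.75, 0.74, 0.87, 0.76, 0.73, 0.89, 0.74, 0.75,   # model_a runs
        0.86, 0.75, 0.72, 0.88, 0.76, 0.74, 0.87, 0.73, 0.74,   # model_b runs
    ],
})

# Two-way ANOVA: both main effects plus the dataset-by-model interaction.
fit = ols("accuracy ~ C(dataset) + C(model) + C(dataset):C(model)", data=results).fit()
anova_table = sm.stats.anova_lm(fit, typ=2)
print(anova_table)
```

Repeating this fit once per metric column (accuracy, macro f1-score, macro precision, macro recall, and so on) would reproduce the "one ANOVA per metric" setup the abstract describes.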
