IEEE Access (Jan 2024)
Decoding Human Facial Emotions: A Ranking Approach Using Explainable AI
Abstract
Facial expression recognition is crucial for deciphering human activities and enabling natural human-computer interaction. Using four datasets, JAFFE, CKPlus, KDEF, and AffectNet, we achieve high accuracy in human facial emotion classification with a VGG16 pre-trained model and transfer learning. To gain a comprehensive insight into the model's decision-making process, we employ Layer-wise Relevance Propagation (LRP), a method from explainable Artificial Intelligence (XAI). For correctly predicted test images, only positive relevance scores are considered; pixels with high relevance scores are treated as the pixels that contribute to predicting the intensity of the emotion. By combining emotion recognition with LRP, we predict both emotion labels and intensity ranks, and we verify the predictions against ground truth using a confusion matrix. Our model achieves intensity prediction accuracies of 96.33% on JAFFE, 95.78% on CKPlus, 95.78% on KDEF, and 93.89% on AffectNet. Ground truth is generated by a group of ten annotators who assign each image a rating of "MINIMAL", "AVERAGE", or "STRONG". This study demonstrates how well our method predicts emotion intensity ranks and provides evidence of the model's trustworthiness and interpretability. Incorporating XAI techniques such as LRP makes intensity ranking in facial emotion recognition more robust.
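The sketch below illustrates, under stated assumptions, the kind of post-processing the abstract describes: keeping only the positive LRP relevance scores of a correctly classified image and mapping them to one of the three intensity ranks. The relevance map `R`, the function `intensity_rank`, and the thresholds `t_low` and `t_high` are illustrative placeholders and not the paper's calibrated values; the actual LRP computation is assumed to have been done beforehand by an LRP implementation for the VGG16 classifier.

```python
import numpy as np

# Minimal sketch (not the authors' implementation): convert an LRP relevance
# map into an intensity rank. `R` is assumed to be a precomputed relevance
# map for a correctly classified face image; thresholds are hypothetical.

RANKS = ("MINIMAL", "AVERAGE", "STRONG")

def intensity_rank(R: np.ndarray, t_low: float = 0.2, t_high: float = 0.5) -> str:
    """Map positive LRP relevance to one of three intensity ranks."""
    pos = np.clip(R, 0.0, None)                    # keep only positive relevance scores
    score = pos.sum() / (np.abs(R).sum() + 1e-8)   # share of total relevance that is positive
    if score < t_low:
        return RANKS[0]
    elif score < t_high:
        return RANKS[1]
    return RANKS[2]

# Usage with a random map standing in for real LRP output of a 224x224 input.
rng = np.random.default_rng(0)
R = rng.normal(size=(224, 224))
print(intensity_rank(R))
```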
Keywords