Journal of Communications Software and Systems (Jun 2021)

A Deep Learning Approach for Real-Time Analysis of Attendees’ Engagement in Public Events

  • Sujith Samuel Mathew,
  • Manar AlKhatib,
  • May El Barachi

DOI: https://doi.org/10.24138/jcomss-2021-0072
Journal volume & issue: Vol. 17, No. 2, pp. 106–115

Abstract

Smart city analytics requires the harnessing and analysis of emotions and sentiments conveyed by images and video footage. In recent years, facial sentiment analysis has attracted significant attention in different application areas, including marketing, gaming, political analytics, healthcare, and human-computer interaction. Aiming to contribute to this area, we propose a deep learning model enabling accurate emotion analysis of crowded scenes containing complete and partially occluded faces, captured at different angles, at various distances from the camera, and at varying resolutions. Our model consists of a convolutional neural network (CNN) combined with pooling, dense, flattening, and softmax layers to achieve accurate sentiment and emotion analysis of facial images. The proposed model was successfully tested on 3,750 images containing 22,563 faces, collected from a large consumer electronics trade show. The model correctly classified test images containing faces with different angles, distances, occlusion areas, facial orientations, and resolutions. It achieved an average accuracy of 90.6% when distinguishing between seven emotions (happiness, smiling, laughter, neutral, sadness, anger, and surprise) in complete faces, and 86.16% accuracy on partially occluded faces. Such a model can be leveraged for the automatic analysis of attendees' engagement level at events. Furthermore, it can open the door to many useful applications in smart cities, such as measuring employees' satisfaction and citizens' happiness.
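
To make the described architecture concrete, the following is a minimal illustrative sketch of a CNN of the kind the abstract outlines (convolution, pooling, flattening, dense, and softmax layers for seven-class facial emotion classification). The framework (Keras), layer counts, filter sizes, and the 48x48 grayscale input shape are assumptions for illustration only; the abstract does not specify them.

```python
# Hypothetical sketch, not the authors' exact architecture: a small Keras CNN
# with convolution, pooling, flattening, dense, and softmax layers for
# seven-class facial emotion classification.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_EMOTIONS = 7  # happiness, smiling, laughter, neutral, sadness, anger, surprise

model = models.Sequential([
    layers.Input(shape=(48, 48, 1)),                  # assumed input size for face crops
    layers.Conv2D(32, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(2),                           # pooling layer
    layers.Conv2D(64, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(2),
    layers.Flatten(),                                 # flattening layer
    layers.Dense(128, activation="relu"),             # dense (fully connected) layer
    layers.Dense(NUM_EMOTIONS, activation="softmax"), # softmax output over the 7 emotions
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

In practice, such a network would be trained on face crops produced by a detection stage, with partially occluded faces included in the training data so the classifier learns to handle them.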

Keywords