Intelligent Systems with Applications (Mar 2024)
Detection of human emotions through facial expressions using hybrid convolutional neural network-recurrent neural network algorithm
Abstract
Cognitive science plays a pivotal role in deciphering human behavior by understanding and interpreting the emotions prevalent in everyday life. These emotions manifest through various cues, including speech patterns, body language, and, notably, facial expressions. Human facial expressions serve as a fundamental mode of communication and interaction. Within the realm of computer vision, Facial Expression Recognition (FER) stands as a crucial field, offering diverse techniques to decode emotions from facial expressions. This research aims to develop a hybrid Convolutional Neural Network–Recurrent Neural Network (CNN-RNN) model adept at detecting human emotions from facial expressions in video data. The models are developed on the Emotional Wearable Dataset 2020. This dataset consists of several expressions, four of which (amusement, enthusiasm, awe, and liking) have not been explored in previous datasets; this expansion provides a more comprehensive approach to emotion detection. Three models (MobileNetV2-RNN, InceptionV3-RNN, and a custom CNN-RNN) are developed for the classification task. The custom CNN-RNN model achieved an accuracy of 63 %, while the MobileNetV2-RNN and InceptionV3-RNN transfer learning models yielded 59 % and 66 %, respectively. The developed models demonstrate enhanced efficiency in distinguishing these nuanced emotions, a significant advancement in the field of facial expression recognition. This research holds substantial implications for cognitive science and real-world applications, particularly in enhancing interactive digital communication and emotional analysis.