Jordanian Journal of Computers and Information Technology (Apr 2024)
LOCAL FEATURE SELECTION USING THE WRAPPER APPROACH FOR FACIAL EXPRESSION RECOGNITION
Abstract
Automatic Facial Expression Recognition (FER) systems provide an important way to express and interpret the emotional and mental states of human beings. These FER systems transform the facial image into a set of features to train a classifier capable of distinguishing between different classes of emotions. However, the extracted feature vectors often contain irrelevant or redundant features, which decreases the accuracy of the induced classifier and increases the computation time. To overcome this problem, dimensionality must be reduced by selecting only the most relevant features. In this paper, we study the impact of adding the "Wrapper" selection approach and using the information provided by different local regions of the face, such as the mouth, eyes and eyebrows, on the performance of a traditional FER system based on a local geometric feature-extraction method. The objective is to test and analyze how this combination can improve the overall performance of the original traditional system. The results obtained on the Multimedia Understanding Group (MUG) database show that the FER system combined with the proposed feature-selection strategy gives better classification results than the original system for all four classification models, namely the K-Nearest Neighbor (KNN) classifier, Tree classifier, Naive Bayes (NB) classifier and Linear Discriminant Analysis (LDA). Indeed, a considerable reduction (up to 50%) in the number of features used and an accuracy of 100% with the LDA classifier were observed, which represents a significant improvement in terms of computation time, efficiency and memory space. Furthermore, the majority of the relevant features selected belong to the "eyebrows region", which confirms the importance of using information from local regions of the face in emotion-recognition tasks. [JJCIT 2024; 10(4): 367-382]
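To make the described pipeline concrete, the sketch below illustrates wrapper-style feature selection wrapped around an LDA classifier using scikit-learn's SequentialFeatureSelector. It is an illustration under assumed data shapes and hyperparameters, not the authors' implementation: the synthetic feature matrix merely stands in for the geometric features that would be extracted from the local facial regions (mouth, eyes, eyebrows) of MUG images.

```python
# Illustrative sketch only: wrapper-style (forward sequential) feature selection
# around an LDA classifier. The data below are synthetic stand-ins; all sizes
# and hyperparameters are assumptions, not values from the paper.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_features, n_classes = 200, 40, 7     # hypothetical sizes
X = rng.normal(size=(n_samples, n_features))      # stand-in for geometric features
y = rng.integers(0, n_classes, size=n_samples)    # stand-in for emotion labels

lda = LinearDiscriminantAnalysis()

# Wrapper selection: candidate feature subsets are scored by the classifier's
# own cross-validated accuracy, and roughly half of the features are kept,
# mirroring the ~50% reduction reported in the abstract.
selector = SequentialFeatureSelector(
    lda, n_features_to_select=0.5, direction="forward", scoring="accuracy", cv=5
)
selector.fit(X, y)
X_selected = selector.transform(X)

print("kept feature indices:", np.flatnonzero(selector.get_support()))
print("CV accuracy on selected features:",
      cross_val_score(lda, X_selected, y, cv=5).mean())
```

In the same spirit, the LDA estimator could be swapped for KNN, a decision tree or Naive Bayes to reproduce the four classification models compared in the paper.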
Keywords