Jisuanji kexue yu tansuo (Dec 2023)

Class-Balanced Modulation for Facial Expression Recognition

  • LIU Chengguang, WANG Shanmin, LIU Qingshan

DOI
https://doi.org/10.3778/j.issn.1673-9418.2210079
Journal volume & issue
Vol. 17, no. 12
pp. 3029–3038

Abstract


Facial expression recognition (FER) aims to determine the type of facial expression in a given facial image and has broad application prospects in psychological diagnosis, human-computer interaction, and related fields. In practical tasks, databases tend to have imbalanced data distributions across the basic facial expressions. This imbalance leads to skewed feature distributions and inconsistent classifier optimization across expressions, seriously degrading the performance of expression recognition models. To address this issue, this paper proposes a class-balanced modulation mechanism for facial expression recognition (CBM-Net), which tackles the imbalanced data distribution by modulating the FER model in both the feature learning and classifier optimization stages. CBM-Net consists of two modules: feature modulation and gradient modulation. The feature modulation module seeks to balance the feature distributions of all facial expressions by increasing inter-class separability and intra-class compactness along the feature directions. The gradient modulation module uses the class statistics of each training batch to inversely adjust the optimization gradient of each classifier, so that all classifiers converge at a consistent speed and reach their best performance simultaneously. Qualitative and quantitative experiments on four popular datasets show that CBM-Net achieves effective class-balanced modulation and performs favorably against many state-of-the-art methods.
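To illustrate the gradient modulation idea in the abstract, the following minimal Python (PyTorch) sketch rescales each class's contribution to the loss by its inverse frequency in the current batch, which in turn scales the per-class classifier gradients. This is only an assumed, illustrative approximation of such inverse adjustment, not the authors' implementation; the class count (7) and the scaling rule are assumptions for illustration.

import torch
import torch.nn.functional as F

def class_balanced_ce(logits: torch.Tensor, targets: torch.Tensor,
                      num_classes: int = 7) -> torch.Tensor:
    # Count how many samples of each expression class appear in this batch.
    counts = torch.bincount(targets, minlength=num_classes).float()
    # Inverse-frequency weights; clamp avoids division by zero for classes
    # that are absent from the batch.
    weights = targets.numel() / (num_classes * counts.clamp(min=1.0))
    # Weighted cross-entropy: minority-class samples contribute larger
    # gradients to their classifiers than majority-class samples.
    return F.cross_entropy(logits, targets, weight=weights)

# Usage: a batch of 32 samples over 7 basic expressions.
logits = torch.randn(32, 7, requires_grad=True)
targets = torch.randint(0, 7, (32,))
loss = class_balanced_ce(logits, targets)
loss.backward()  # gradients on rare-expression classifiers are amplified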

Keywords