Jisuanji kexue yu tansuo (Nov 2022)

Dynamically Consistent and Confident Deep Semi-supervised Learning

  • LI Yong, GAO Can, LIU Zirong, LUO Jintao

DOI
https://doi.org/10.3778/j.issn.1673-9418.2104121
Journal volume & issue
Vol. 16, no. 11
pp. 2557 – 2564

Abstract


Deep semi-supervised learning methods based on consistency regularization and entropy minimization can effectively improve the performance of large-scale neural networks and reduce the need for labeled data. However, the regularization losses of existing consistency regularization methods account for neither the differences between samples nor the negative impact of mislabeled predictions, while entropy minimization methods cannot flexibly adjust the prediction probability distribution. Firstly, to alleviate the negative impact of discrepancies between unlabeled samples and of mislabeled predictions, a new consistency loss function named dynamically weighted consistency regularization (DWCR) is proposed, which dynamically weights the consistency loss of unlabeled samples. Then, to further adjust the prediction probability distribution, a new loss function called self-confidence promotion loss (SCPL) is proposed, which flexibly adjusts the strength with which the model is pushed toward low-entropy predictions and achieves low-density separation between classes, thus improving classification performance. Finally, a deep semi-supervised learning method named dynamic consistency and confidence (DCC) is proposed by combining DWCR, SCPL, and the supervised loss. Experiments on several datasets show that the proposed method achieves better classification performance than state-of-the-art deep semi-supervised learning methods.
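The abstract describes the DCC objective only at a conceptual level (supervised loss plus a confidence-weighted consistency term and a tunable confidence-promotion term). The sketch below is one possible reading of that combination, not the authors' released code: the confidence-based weighting, the sharpening exponent gamma, and the trade-off coefficients lambda_c and lambda_s are assumptions introduced purely for illustration.

```python
# Illustrative sketch (assumptions, not the paper's exact formulas) of a combined
# loss in the spirit of DCC: supervised cross-entropy + dynamically weighted
# consistency (DWCR-like) + a tunable low-entropy / self-confidence term (SCPL-like).
import torch
import torch.nn.functional as F


def dcc_loss(logits_labeled, targets, logits_weak, logits_strong,
             lambda_c=1.0, lambda_s=0.1, gamma=2.0):
    """Hypothetical combined loss for dynamic consistency and confidence."""
    # Standard supervised cross-entropy on the labeled batch.
    sup_loss = F.cross_entropy(logits_labeled, targets)

    # Pseudo-targets from the weakly augmented view (no gradient through them).
    probs_weak = torch.softmax(logits_weak.detach(), dim=1)
    conf, _ = probs_weak.max(dim=1)

    # DWCR-like term (assumed form): per-sample consistency between the two views,
    # weighted by the confidence of the weak-view prediction so that samples with
    # likely mislabeled predictions contribute less to the loss.
    per_sample_cons = F.kl_div(F.log_softmax(logits_strong, dim=1),
                               probs_weak, reduction="none").sum(dim=1)
    dwcr = (conf * per_sample_cons).mean()

    # SCPL-like term (assumed form): gamma controls how strongly the model is
    # pushed toward low-entropy (confident) predictions on unlabeled data.
    probs_strong = torch.softmax(logits_strong, dim=1)
    scpl = -(probs_strong ** gamma).sum(dim=1).mean()

    return sup_loss + lambda_c * dwcr + lambda_s * scpl
```

In this reading, larger gamma or lambda_s pushes predictions more aggressively toward one-hot distributions, which is the "flexibly adjustable" low-entropy behavior the abstract attributes to SCPL; the exact functional forms in the paper may differ.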

Keywords