IEEE Access (Jan 2022)

Catastrophic Forgetting Problem in Semi-Supervised Semantic Segmentation

  • Yan Zhou,
  • Ruyi Jiao,
  • Dongli Wang,
  • Jinzhen Mu,
  • Jianxun Li

DOI
https://doi.org/10.1109/ACCESS.2022.3172664
Journal volume & issue
Vol. 10, pp. 48855–48864

Abstract


Restricted by the cost of generating labels for training, semi-supervised methods have been applied to semantic segmentation tasks and have achieved varying degrees of success. Recently, semi-supervised learning methods have taken pseudo supervision as their core idea, especially self-training methods that generate pseudo labels. However, pseudo labels are noisy. In semi-supervised learning, as training progresses, the model must attend to more semantic classes and becomes biased towards the newly learned ones. Moreover, due to the limited amount of labeled data, it is difficult for the model to “stabilize” the learned knowledge. This raises the issue of the model forgetting previously learned knowledge. Based on this new view, we point out that alleviating “catastrophic forgetting” in the model is beneficial for enhancing the quality of pseudo labels, and we propose a pseudo label enhancement strategy. In this strategy, the pseudo labels generated by the previous model are used to rehearse previous knowledge. Additionally, conflict reduction is proposed to resolve conflicts between the pseudo labels generated by the previous and current models. We evaluate our scheme on two general semi-supervised semantic segmentation benchmarks and achieve state-of-the-art performance on both. Our code is released at https://github.com/wing212/DMT-PLE.
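
To make the rehearsal and conflict-reduction ideas concrete, the sketch below shows one plausible way to fuse pseudo labels from the previous and current models using confidence thresholds. This is a minimal illustration, not the authors' released implementation (see the repository above): the function name enhance_pseudo_labels, the inputs prev_logits and curr_logits, the threshold conf_threshold, and the specific fusion rule are all assumptions made here for clarity.

    # Illustrative sketch only; not the code from https://github.com/wing212/DMT-PLE.
    import torch

    IGNORE_INDEX = 255  # conventional "ignore" label in segmentation losses

    def enhance_pseudo_labels(prev_logits, curr_logits, conf_threshold=0.9):
        """Fuse pseudo labels from the previous and current models.

        prev_logits, curr_logits: (B, C, H, W) class scores produced by the
        previous-round and current models on the same unlabeled images.
        Returns a (B, H, W) pseudo label map in which conflicting,
        low-confidence pixels are marked IGNORE_INDEX.
        """
        prev_prob, prev_label = torch.softmax(prev_logits, dim=1).max(dim=1)
        curr_prob, curr_label = torch.softmax(curr_logits, dim=1).max(dim=1)

        # Start from the current model's predictions.
        fused = curr_label.clone()

        # Rehearsal: where the previous model is confident and the current
        # model is not, keep the previous pseudo label so that previously
        # learned knowledge is revisited during training.
        keep_prev = (prev_prob >= conf_threshold) & (curr_prob < conf_threshold)
        fused[keep_prev] = prev_label[keep_prev]

        # Conflict reduction: where the two models disagree and neither is
        # confident, drop the pixel from the pseudo supervision signal.
        conflict = (prev_label != curr_label) \
            & (prev_prob < conf_threshold) & (curr_prob < conf_threshold)
        fused[conflict] = IGNORE_INDEX
        return fused

Under these assumptions, the fused map can be used as the target of a standard cross-entropy loss with ignore_index=IGNORE_INDEX, so that conflicting pixels contribute no gradient.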
