Future Internet (Feb 2022)

JoSDW: Combating Noisy Labels by Dynamic Weight

  • Yaojie Zhang,
  • Huahu Xu,
  • Junsheng Xiao,
  • Minjie Bian

DOI: https://doi.org/10.3390/fi14020050
Journal volume & issue: Vol. 14, no. 2, p. 50

Abstract

The real world is full of noisy labels, which cause neural networks to perform poorly because deep neural networks (DNNs) are prone to overfitting label noise. Training with noisy labels is a challenging problem in weakly supervised learning. The most advanced existing methods mainly adopt a small-loss sample selection strategy, selecting the small-loss portion of the samples to train the network. However, previous work stops there, neglecting how the small-loss selection strategy behaves while the DNNs are trained and across different training stages, how the collaborative learning of the two networks evolves from disagreement to agreement, and how a second classification can be made on that basis. We train the networks using a comparative learning method. Specifically, we design a small-loss sample selection strategy with dynamic weights. Based on the networks' predictions, this strategy increases the proportion of agreed samples while gradually reducing the weight of hard samples and increasing the weight of clean samples. Extensive experiments verify the superiority of our method.
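To make the abstract's core idea concrete, the following is a minimal sketch of one training step combining small-loss sample selection with a dynamic per-sample weight in a two-network, Co-teaching-style cross-update. It is illustrative only: the function names (`dynamic_weight`, `co_training_step`), the linear sharpening schedule, and the cross-update structure are assumptions for exposition, not the authors' exact JoSDW formulation.

```python
import torch
import torch.nn.functional as F

def dynamic_weight(per_sample_loss, epoch, num_epochs):
    """Illustrative schedule: samples that look clean (small loss) get a
    weight near 1; hard (large-loss) samples are down-weighted, and the
    down-weighting sharpens as training proceeds."""
    norm = (per_sample_loss - per_sample_loss.min()) / (
        per_sample_loss.max() - per_sample_loss.min() + 1e-8)
    sharpness = 1.0 + 4.0 * epoch / num_epochs  # assumed schedule
    return (1.0 - norm) ** sharpness

def co_training_step(net1, net2, opt1, opt2, x, y,
                     keep_ratio, epoch, num_epochs):
    # Rank samples by loss without building a gradient graph.
    with torch.no_grad():
        loss1 = F.cross_entropy(net1(x), y, reduction="none")
        loss2 = F.cross_entropy(net2(x), y, reduction="none")

    # Small-loss selection: each network keeps its low-loss fraction.
    k = max(1, int(keep_ratio * y.size(0)))
    idx1 = torch.argsort(loss1)[:k]  # samples net1 trusts
    idx2 = torch.argsort(loss2)[:k]  # samples net2 trusts

    # Dynamic weights for the selected samples.
    w1 = dynamic_weight(loss1[idx1], epoch, num_epochs)
    w2 = dynamic_weight(loss2[idx2], epoch, num_epochs)

    # Cross-update: each network trains on its peer's selection.
    opt1.zero_grad()
    (w2 * F.cross_entropy(net1(x[idx2]), y[idx2],
                          reduction="none")).mean().backward()
    opt1.step()

    opt2.zero_grad()
    (w1 * F.cross_entropy(net2(x[idx1]), y[idx1],
                          reduction="none")).mean().backward()
    opt2.step()
```

Under this sketch, `keep_ratio` plays the role of the small-loss selection budget, and the sharpening exponent in `dynamic_weight` mimics the abstract's description of gradually suppressing hard samples while emphasizing clean ones as training progresses.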

Keywords