Computation (Aug 2023)

The Weights Reset Technique for Deep Neural Networks Implicit Regularization

  • Grigoriy Plusch,
  • Sergey Arsenyev-Obraztsov,
  • Olga Kochueva

DOI
https://doi.org/10.3390/computation11080148
Journal volume & issue
Vol. 11, no. 8
p. 148

Abstract


We present a new regularization method called Weights Reset, which involves periodically resetting a random portion of layer weights during training using predefined probability distributions. The technique was applied and tested on several popular classification datasets: Caltech-101, CIFAR-100, and Imagenette. We compare the results with those of traditional regularization methods. The tests demonstrate that the Weights Reset method is competitive, achieving the best performance on the Imagenette dataset and on the challenging, unbalanced Caltech-101 dataset. The method also shows potential to prevent vanishing and exploding gradients. However, this analysis is preliminary; further comprehensive studies are needed to gain a deeper understanding of the potential and limitations of the Weights Reset method. The observed results suggest that Weights Reset can be regarded as an effective extension of traditional regularization methods and can help improve model performance and generalization.
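To make the idea concrete, here is a minimal sketch of a weights-reset step in PyTorch. The reset probability, the normal re-initialization distribution, and the every-N-steps schedule are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

def weights_reset(model: nn.Module, reset_prob: float = 0.05) -> None:
    """Reset a random portion of each layer's weights.

    Each weight is independently selected with probability `reset_prob`
    and re-drawn from a predefined distribution (a normal distribution
    here; the paper's choice of distribution may differ).
    """
    with torch.no_grad():
        for module in model.modules():
            if isinstance(module, (nn.Linear, nn.Conv2d)):
                w = module.weight
                # Boolean mask of weights chosen for reset.
                mask = torch.rand_like(w) < reset_prob
                # Fresh values from the predefined distribution (assumed).
                fresh = torch.empty_like(w).normal_(std=0.02)
                w[mask] = fresh[mask]

# Hypothetical usage: apply the reset periodically during training.
# for step, (x, y) in enumerate(loader):
#     ...forward, backward, optimizer step...
#     if step % reset_every == 0:
#         weights_reset(model, reset_prob=0.05)
```

The key design point is that only a random subset of weights is re-initialized at each reset, so the bulk of the learned parameters is preserved while the perturbation acts as an implicit regularizer.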

Keywords