IEEE Access (Jan 2022)
The Effect of Batch Normalization on Noise Resistant Property of Deep Learning Models
Abstract
The fast execution speed and energy efficiency of analog hardware have made it a strong contender for deploying deep learning models at the edge. However, analog noise perturbs the weights of deployed models, degrading their performance despite the inherent noise-resistant characteristics of deep learning models. This work investigates the effect of the popular batch normalization layer (BatchNorm) on the noise-resistant ability of deep learning models. The systematic study proceeds by first training different models with and without the BatchNorm layer on the CIFAR10 and CIFAR100 datasets. Analog noise is then injected into the weights of the resulting models, and their performance on the test dataset is measured and compared. The results show that the presence of the BatchNorm layer negatively impacts the noise-resistant property of deep learning models: ResNet44 and VGG16 models with BatchNorm layers trained on the CIFAR10 dataset achieve average normalized inference accuracies of 41.32% and 10.75%, respectively, compared to 91.95% and 93.80% for the same ResNet44 and VGG16 models without BatchNorm layers. Furthermore, the negative impact grows with the number of BatchNorm layers in the model.
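As a reading aid, below is a minimal PyTorch sketch of the evaluation protocol the abstract describes: perturbing the weights of a trained model with Gaussian noise and reporting accuracy normalized by the clean accuracy. The additive noise model scaled by each layer's weight standard deviation, the `noise_factor` knob, and the number of trials are illustrative assumptions, not the paper's exact configuration.

```python
import copy
import torch

def inject_weight_noise(model, noise_factor=0.1):
    """Return a copy of `model` whose weights are perturbed by additive
    Gaussian noise scaled by each layer's weight standard deviation.
    (`noise_factor` is a hypothetical knob, not the paper's setting.)"""
    noisy = copy.deepcopy(model)
    with torch.no_grad():
        for name, param in noisy.named_parameters():
            if "weight" in name:
                sigma = noise_factor * param.std()
                param.add_(torch.randn_like(param) * sigma)
    return noisy

def accuracy(model, loader, device):
    """Top-1 accuracy of `model` on the batches yielded by `loader`."""
    model.eval().to(device)
    correct = total = 0
    with torch.no_grad():
        for x, y in loader:
            pred = model(x.to(device)).argmax(dim=1)
            correct += (pred == y.to(device)).sum().item()
            total += y.numel()
    return correct / total

def normalized_accuracy(model, loader, device, noise_factor=0.1, trials=10):
    """Average accuracy over several noise-injected copies, divided by the
    clean accuracy -- the 'normalized inference accuracy' style of metric."""
    clean = accuracy(model, loader, device)
    noisy = [accuracy(inject_weight_noise(model, noise_factor), loader, device)
             for _ in range(trials)]
    return sum(noisy) / (trials * clean)
```

Averaging over multiple independent noise draws, as in `normalized_accuracy`, reduces the variance of the estimate, since each injection samples a different random perturbation of the weights.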
Keywords