IEEE Access (Jan 2020)

Filter Pruning Without Damaging Networks Capacity

  • Yuding Zuo,
  • Bo Chen,
  • Te Shi,
  • Mengfan Sun

DOI
https://doi.org/10.1109/ACCESS.2020.2993932
Journal volume & issue
Vol. 8
pp. 90924–90930

Abstract

Due to their over-parameterized design, deep convolutional neural networks involve a huge number of parameters and a high computational cost, making them difficult to deploy on devices with limited computational resources. In this paper, we propose a method of filter pruning without damaging network capacity to accelerate and compress deep convolutional neural networks. Unlike some existing filter pruning methods, we pay particular attention to the damage that filter pruning causes to model capacity. To restore the original model capacity, we generate new feature maps from the remaining feature maps using a lighter structure after pruning the redundant filters that are similar to other filters. Experimental results on the CIFAR-10 and CIFAR-100 benchmarks demonstrate the effectiveness of our method. In particular, our method reduces more than 49% of the FLOPs of VGGNet-16 on CIFAR-10 with only a 0.07% relative accuracy drop, and the relative accuracy even increases by 0.13% while reducing more than 24% of the FLOPs. Moreover, our method accelerates ResNet-110 on CIFAR-10 by 22.1% with a 0.41% accuracy improvement, exceeding previous methods.
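
The core idea described in the abstract, identifying filters that are similar to other filters in the same layer as candidates for pruning, can be illustrated with a minimal sketch. The redundancy score used here (maximum pairwise cosine similarity between flattened filter weights) and the function name are assumptions for illustration only; the paper's exact similarity criterion and its mechanism for regenerating feature maps from the survivors may differ.

import numpy as np

def redundant_filter_indices(weights, prune_ratio=0.5):
    # weights: conv-layer kernel of shape (out_channels, in_channels, k, k)
    # prune_ratio: fraction of filters to mark as redundant
    n = weights.shape[0]
    flat = weights.reshape(n, -1)
    # Normalize each filter so dot products become cosine similarities.
    norms = np.linalg.norm(flat, axis=1, keepdims=True) + 1e-12
    unit = flat / norms
    sim = unit @ unit.T              # pairwise cosine similarity matrix
    np.fill_diagonal(sim, 0.0)       # ignore self-similarity
    # A filter whose maximum similarity to another filter is high carries
    # little unique information, so it is ranked as more redundant.
    redundancy = sim.max(axis=1)
    num_prune = int(round(prune_ratio * n))
    return np.argsort(redundancy)[::-1][:num_prune]

# Toy example: 8 filters of shape 3x3x3, with filter 5 a near-duplicate of filter 1.
rng = np.random.default_rng(0)
w = rng.standard_normal((8, 3, 3, 3))
w[5] = w[1] + 0.01 * rng.standard_normal((3, 3, 3))
print(redundant_filter_indices(w, prune_ratio=0.25))  # filter 5 (or 1) ranks first

In a full pipeline, the filters returned by such a scoring step would be removed from the layer, and, as the abstract describes, the pruned feature maps would be reconstructed from the remaining ones with a lighter structure so that the model's capacity is preserved.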

Keywords