IEEE Access (Jan 2020)
Pruning Filters Based on Extending Filter Group Lasso
Abstract
Deep Convolutional Neural Networks (CNNs) have been widely used in image recognition, yet CNN models are expected to become more compact as demands grow from various kinds of AI applications. Since sparsity is generally accepted as an inherent characteristic of pruned models, L1-based approaches typically add a structured penalty on convolution filters or channels to the training objective in a straightforward manner. The sparse models induced in this way merely reflect a mechanical balance between the training loss and the penalty. In this paper, we construct the Extending Filter Group (EFG) through a thorough investigation of the underlying constraints between every two successive layers. The EFG penalty acts during training on the filters of the current layer and the corresponding channels of the following layer, which we call synchronous reinforcement. It thus provides an alternative way to induce a model with the desired sparsity, especially for complex datasets. Moreover, we present the Noise Filter Recognition Mechanism (NFRM) to improve model accuracy, since a smaller model is derived from the original one based on our contributions. Our method achieves an accuracy of 72.67% on the CIFAR-100 dataset, compared with 72.04% for the current baseline. Notably, the pruning rate achieved by our method reaches 43.8%, which is higher than those achieved by other popular pruning methods.
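As a rough illustration of the idea described above (the notation below is our own sketch, not the paper's exact formulation), an EFG-style group-Lasso penalty can be read as grouping the $i$-th filter of layer $l$ with the $i$-th input channel of layer $l+1$:

$$
R_{\mathrm{EFG}}(W) \;=\; \sum_{l}\sum_{i=1}^{N_l} \lambda_l \,\Big\| \big[\, \mathrm{vec}\!\big(W^{(l)}_{i,:,:,:}\big),\; \mathrm{vec}\!\big(W^{(l+1)}_{:,i,:,:}\big) \,\big] \Big\|_2 ,
$$

so that driving one group toward zero simultaneously suppresses a filter and the channel of the next layer that consumes its output, matching the synchronous reinforcement described in the abstract.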
Keywords