IEEE Access (Jan 2021)

DivNet: Efficient Convolutional Neural Network via Multilevel Hierarchical Architecture Design

  • Bachir Kaddar,
  • Hadria Fizazi,
  • Miguel Hernandez-Cabronero,
  • Victor Sanchez,
  • Joan Serra-Sagrista

DOI
https://doi.org/10.1109/ACCESS.2021.3099952
Journal volume & issue
Vol. 9
pp. 105892–105901

Abstract

Designing small and efficient mobile neural networks is difficult because the challenge is to determine the architecture that achieves the best performance under a given computational budget. Previous lightweight neural networks rely on a cell module that is repeated in all stacked layers across the network. These approaches do not permit layer diversity, which is critical for achieving strong performance. This paper presents an experimental study to develop an efficient mobile network using a hierarchical architecture. Our proposed mobile network, called Diversity Network (DivNet), performs better, in both complexity cost and accuracy, than the architecture of simply stacked layers generally employed by the best high-efficiency models. We describe a set of architectural design decisions that reduce the proposed model's size while yielding a significant performance improvement. Our experiments on image classification show that, compared to MobileNetV2, SqueezeNet, and ShuffleNetV2, respectively, DivNet improves accuracy by 2.09%, 0.76%, and 0.66% on the CIFAR100 dataset, and by 0.05%, 4.96%, and 1.13% on the CIFAR10 dataset. On more complex datasets, e.g., ImageNet, DivNet achieves 70.65% Top-1 accuracy and 90.23% Top-5 accuracy, still better than other small models such as MobileNet, SqueezeNet, and ShuffleNet.
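The contrast the abstract draws, between stacking one repeated cell and allowing per-stage layer diversity, can be sketched with a toy parameter-count comparison. This is purely illustrative and not the paper's actual DivNet code: the block specifications below are made up, and the count covers only plain 2-D convolution weights (k × k × c_in × c_out, bias omitted).

```python
# Illustrative sketch (not the authors' DivNet implementation): compare a
# homogeneous stack, where one cell spec is repeated, with a diverse stack,
# where each stage uses a different kernel size and width.

def conv_params(k, c_in, c_out):
    """Weights of a k x k convolution mapping c_in to c_out channels."""
    return k * k * c_in * c_out

def stack_params(specs):
    """Total convolution weights of a stack of (kernel, c_in, c_out) specs."""
    return sum(conv_params(k, ci, co) for k, ci, co in specs)

# Homogeneous: the same 3x3, 64->64 cell repeated four times.
homogeneous = [(3, 64, 64)] * 4

# Diverse (hypothetical): varied kernels and channel widths per stage.
diverse = [(3, 64, 32), (5, 32, 48), (3, 48, 96), (1, 96, 128)]

print(stack_params(homogeneous))  # 147456
print(stack_params(diverse))      # 110592
```

Under these made-up specs the diverse stack uses fewer weights than the repeated cell, which mirrors the abstract's claim that per-layer diversity leaves room to trade complexity for accuracy; the actual savings in DivNet come from the architectural decisions detailed in the paper.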

Keywords