IEEE Access (Jan 2019)

BitFlow-Net: Toward Fully Binarized Convolutional Neural Networks

  • Lijun Wu,
  • Peiqing Jiang,
  • Zhicong Chen,
  • Xu Lin,
  • Yunfeng Lai,
  • Peijie Lin,
  • Shuying Cheng

DOI: https://doi.org/10.1109/ACCESS.2019.2945488
Journal volume & issue: Vol. 7, pp. 154617–154626

Abstract

Binarization can greatly compress and accelerate deep convolutional neural networks (CNNs) for real-time industrial applications. However, existing binarized CNNs (BCNNs) rely on the scaling factor (SF) and batch normalization (BatchNorm), both of which still involve resource-consuming floating-point multiplication operations. To address this limitation, an improved BCNN named BitFlow-Net is proposed, which replaces these floating-point operations with integer additions in the middle layers. First, it is derived that the SF is effective only in the back-propagation process, whereas it is counteracted by BatchNorm in the inference process. Then, in the model running phase, the SF and BatchNorm are fused into a single integer addition, named BatchShift. Consequently, the data flow in the middle layers is fully binarized at run time. To verify its potential in industrial applications with multiclass and binary classification tasks, BitFlow-Net is built on AlexNet and evaluated on two large image datasets, i.e., ImageNet and 11K Hands. Experimental results show that BitFlow-Net removes all floating-point operations from the middle layers of BCNNs and greatly reduces the memory footprint in both cases without affecting accuracy. In particular, BitFlow-Net achieves accuracy comparable to that of the full-precision AlexNet network in the binary classification task.
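The fusion described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names are illustrative, and it assumes per-channel BatchNorm with gamma > 0 applied to the integer outputs of a binary convolution, with the common BCNN convention sign(0) = +1. Under these assumptions, BatchNorm followed by the sign activation collapses exactly to a single precomputed integer shift.

    import numpy as np

    def bn_binarize(x_int, gamma, beta, mu, sigma):
        """Reference path: floating-point BatchNorm (inference form) + sign."""
        y = gamma * (x_int - mu) / sigma + beta
        return np.where(y >= 0, 1, -1)  # sign activation, sign(0) = +1 convention

    def batchshift_binarize(x_int, gamma, beta, mu, sigma):
        """Fused path in the spirit of BatchShift.

        For gamma > 0, the reference path outputs +1 iff
        x >= mu - beta * sigma / gamma. Since x is an integer, the float
        threshold can be replaced exactly by its ceiling, leaving only an
        integer addition at run time.
        """
        shift = np.int32(np.ceil(mu - beta * sigma / gamma))  # precomputed once, offline
        return np.where(x_int - shift >= 0, 1, -1)            # integer-only at run time

    # Toy check: integer pre-activations, e.g., popcount results of a binary conv.
    x = np.random.randint(-64, 64, size=8)
    assert np.array_equal(bn_binarize(x, 1.3, -0.2, 3.5, 4.1),
                          batchshift_binarize(x, 1.3, -0.2, 3.5, 4.1))

Because the shift constant depends only on the trained BatchNorm statistics and the SF, it can be folded in once after training, which is consistent with the abstract's claim that the middle layers need no floating-point operations at run time.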

Keywords