Sensors (Jul 2024)

Optimizing Data Flow in Binary Neural Networks

  • Lorenzo Vorabbi,
  • Davide Maltoni,
  • Stefano Santi

DOI
https://doi.org/10.3390/s24154780
Journal volume & issue
Vol. 24, no. 15
p. 4780

Abstract

Binary neural networks (BNNs) can substantially accelerate a neural network's inference time by replacing its costly floating-point arithmetic with bit-wise operations. Nevertheless, state-of-the-art approaches reduce the efficiency of the data flow in the BNN layers by introducing intermediate conversions from 1 to 16/32 bits. We propose a novel training scheme, denoted as BNN-Clip, that increases the parallelism and data-flow efficiency of the BNN pipeline; specifically, we introduce a clipping block that reduces the data width from 32 bits to 8. Furthermore, we shrink the internal accumulator of a binary layer, typically kept at 32 bits to prevent overflow, with no loss of accuracy. Moreover, we propose an optimization of the batch-normalization layer that reduces latency and simplifies deployment. Finally, we present an optimized implementation of binary direct convolution for the ARM NEON instruction set. Our experiments show a consistent inference latency speed-up (up to 1.3× and 2.4× compared to two state-of-the-art BNN frameworks) while reaching accuracy comparable with state-of-the-art approaches on datasets such as CIFAR-10, SVHN, and ImageNet.
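To make the abstract's core ideas concrete, the following is a minimal illustrative sketch of the arithmetic a binary layer relies on: a dot product of two {-1,+1} vectors computed via XOR and popcount on packed bits, followed by a saturation step that narrows a wide accumulator to 8 bits. All function names are hypothetical, and the clipping here is a plain saturation for illustration; the paper's BNN-Clip block is learned during training and is not reproduced by this snippet.

```python
def pack_bits(v):
    """Pack a {-1, +1} vector into an integer bitmask (+1 -> bit 1, -1 -> bit 0)."""
    bits = 0
    for i, x in enumerate(v):
        if x == 1:
            bits |= 1 << i
    return bits

def binary_dot(a_bits, b_bits, n):
    """Dot product of two packed {-1, +1} vectors of length n.

    Differing bit positions contribute -1 each, matching ones +1 each,
    so dot = n - 2 * popcount(a XOR b).
    """
    return n - 2 * bin(a_bits ^ b_bits).count("1")

def clip8(acc):
    """Illustrative clipping step: saturate a wide accumulator to the int8 range,
    standing in for the 32-to-8-bit data-width reduction described above."""
    return max(-128, min(127, acc))

# Example: a = [+1]*8, b = [+1]*4 + [-1]*4 -> true dot product is 0.
a = [1] * 8
b = [1] * 4 + [-1] * 4
result = binary_dot(pack_bits(a), pack_bits(b), 8)  # -> 0
```

On real hardware the popcount maps to a single instruction (e.g. NEON `VCNT` on ARM), which is what makes this formulation fast; the 8-bit saturation then lets subsequent layers stay on narrow SIMD lanes instead of widening to 32 bits.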

Keywords