IEEE Access (Jan 2020)

Efficient Spiking Neural Networks With Logarithmic Temporal Coding

  • Ming Zhang
  • Zonghua Gu
  • Nenggan Zheng
  • De Ma
  • Gang Pan

DOI: https://doi.org/10.1109/ACCESS.2020.2994360
Journal volume & issue: Vol. 8, pp. 98156–98167

Abstract

A Spiking Neural Network (SNN) can be trained indirectly by first training an Artificial Neural Network (ANN) with the conventional backpropagation algorithm and then converting it into an equivalent SNN. To reduce the computational cost of the resulting SNN, measured by the number of spikes, we present Logarithmic Temporal Coding (LTC), in which the number of spikes used to encode an activation grows logarithmically with the activation value, together with the accompanying Exponentiate-and-Fire (EF) neuron model, which involves only efficient bit-shift and addition operations. Moreover, we improve the ANN training process to compensate for the approximation errors introduced by LTC. Experimental results indicate that the resulting SNN achieves competitive classification accuracy at significantly lower computational cost than related work.
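
The abstract describes LTC and the EF neuron only at a high level. As a rough illustration of the idea, the Python sketch below encodes an activation as a sum of powers of two, emitting one spike per retained power, so the spike count grows roughly logarithmically with the activation value; the EF update doubles the membrane potential each time step (a bit shift in fixed-point arithmetic) and adds the weighted input. The exponent range, function names, and the exact EF update rule are assumptions made for illustration, not taken from the paper.

    def ltc_encode(activation, exp_max=0, exp_min=-7):
        # Greedy pass over the exponent range, largest power of two first.
        # A spike at time step t stands for the power 2**(exp_max - t),
        # so earlier spikes carry exponentially larger weight.
        # exp_max/exp_min are assumed parameters, not from the paper.
        spikes = []
        remainder = activation
        for t, e in enumerate(range(exp_max, exp_min - 1, -1)):
            if remainder >= 2.0 ** e:
                spikes.append(t)        # keep this power of two
                remainder -= 2.0 ** e
        return spikes

    def ltc_decode(spikes, exp_max=0):
        # Reconstruct the (approximate) activation from the spike times.
        return sum(2.0 ** (exp_max - t) for t in spikes)

    def ef_step(potential, weighted_input, threshold=1.0):
        # Assumed form of one Exponentiate-and-Fire update: the membrane
        # potential doubles every step (a left shift in fixed point) and
        # accumulates the weighted input; crossing the threshold emits a
        # spike and subtracts the threshold.
        potential = 2 * potential + weighted_input
        fired = potential >= threshold
        if fired:
            potential -= threshold
        return potential, fired

    # Example: 0.8125 = 0.5 + 0.25 + 0.0625 -> three spikes.
    times = ltc_encode(0.8125)
    print(times, ltc_decode(times))     # [1, 2, 4] 0.8125

Running this on 0.8125 yields spikes at steps 1, 2, and 4, which decode exactly back to 0.8125; doubling the activation shifts the spike train rather than doubling its length, which is the logarithmic property the coding is named for.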
