Scientific Reports (Dec 2023)

Sparse-firing regularization methods for spiking neural networks with time-to-first-spike coding

  • Yusuke Sakemi,
  • Kakei Yamamoto,
  • Takeo Hosomi,
  • Kazuyuki Aihara

DOI
https://doi.org/10.1038/s41598-023-50201-5
Journal volume & issue
Vol. 13, no. 1
pp. 1–12

Abstract

The training of multilayer spiking neural networks (SNNs) with the error backpropagation algorithm has made significant progress in recent years. Among the various training schemes, the backpropagation method that operates directly on neuronal firing times has attracted considerable attention because it can realize ideal temporal coding. This method uses time-to-first-spike (TTFS) coding, in which each neuron fires at most once, and this restriction on the number of firings enables information to be processed at a very low firing frequency. This low firing frequency increases the energy efficiency of information processing in SNNs. However, the at-most-once constraint provides only an upper limit on the firing frequency of TTFS-coded SNNs, and their information-processing capability at lower firing frequencies has not been fully investigated. In this paper, we propose two spike-timing-based sparse-firing (SSR) regularization methods that further reduce the firing frequency of TTFS-coded SNNs. Both methods require only the firing times and the associated weights. The effects of these regularization methods were investigated on the MNIST, Fashion-MNIST, and CIFAR-10 datasets using multilayer perceptron and convolutional neural network architectures.
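
The abstract describes TTFS coding (each neuron fires at most once, encoding information in its first firing time) and a regularization that depends only on firing times and weights. The sketch below is a minimal illustration of these two ideas, not the paper's SSR formulation: the non-leaky current-based neuron model, the time window `T`, and the exponential penalty `sparse_firing_penalty` are assumptions made purely for illustration.

```python
# Illustrative sketch only: the neuron model, window T, and penalty form
# are assumptions for demonstration, not the SSR methods of the paper.
import numpy as np

def ttfs_firing_times(weights, in_times, threshold=1.0, T=10.0):
    """TTFS coding: each output neuron fires at most once, at the first
    time its membrane potential (driven by weighted input spikes with a
    linear current ramp) crosses `threshold`. Neurons that never cross
    are assigned T, i.e. "no spike" within the window."""
    n_out = weights.shape[0]
    out_times = np.full(n_out, T)
    grid = np.linspace(0.0, T, 1001)  # dense time grid for the crossing search
    for j in range(n_out):
        # potential(t) = sum_i w_ji * max(t - t_i, 0)
        v = np.sum(
            weights[j][:, None] * np.maximum(grid[None, :] - in_times[:, None], 0.0),
            axis=0,
        )
        crossed = np.where(v >= threshold)[0]
        if crossed.size > 0:
            out_times[j] = grid[crossed[0]]
    return out_times

def sparse_firing_penalty(out_times, T=10.0, tau=2.0):
    """Hypothetical spike-timing-based sparsity penalty: spikes early in
    the window are costly, spikes pushed toward the end of the window
    (or suppressed entirely) cost almost nothing, so minimizing this term
    alongside the task loss lowers the effective firing frequency."""
    fired = out_times < T
    return float(np.sum(np.exp(-out_times / tau) * fired))

# Usage example with random weights and input spike times.
rng = np.random.default_rng(0)
W = rng.normal(0.0, 0.5, size=(4, 8))   # 8 inputs -> 4 TTFS-coded neurons
t_in = rng.uniform(0.0, 3.0, size=8)    # input spike times
t_out = ttfs_firing_times(W, t_in)
print("firing times:", t_out)
print("sparsity penalty:", sparse_firing_penalty(t_out))
```

In a gradient-based setting, a term of this kind would be added to the classification loss so that training trades a small amount of accuracy for fewer and later spikes; the paper investigates this trade-off on MNIST, Fashion-MNIST, and CIFAR-10.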