Frontiers in Neuroscience (Jul 2020)

On-Chip Training Spiking Neural Networks Using Approximated Backpropagation With Analog Synaptic Devices

  • Dongseok Kwon,
  • Suhwan Lim,
  • Jong-Ho Bae,
  • Sung-Tae Lee,
  • Hyeongsu Kim,
  • Young-Tak Seo,
  • Seongbin Oh,
  • Jangsaeng Kim,
  • Kyuho Yeom,
  • Byung-Gook Park,
  • Jong-Ho Lee

DOI
https://doi.org/10.3389/fnins.2020.00423
Journal volume & issue
Vol. 14

Abstract


Hardware-based spiking neural networks (SNNs) inspired by the biological nervous system are regarded as innovative computing systems with very low power consumption and massively parallel operation. To train SNNs with supervision, we propose an efficient on-chip training scheme that approximates the backpropagation algorithm in a form suitable for hardware implementation. By exploiting the stochastic characteristics of neurons, we show that the accuracy of SNNs trained with the proposed scheme is close to that of conventional artificial neural networks (ANNs). In the hardware configuration, gated Schottky diodes (GSDs), whose output current saturates with respect to the input voltage, are used as synaptic devices. We design the SNN system using the proposed on-chip training scheme with GSDs, whose conductances can be updated in parallel to speed up the overall system. The performance of the on-chip training SNN system is validated through MNIST data set classification as a function of network size and total number of time steps. The SNN system achieves an accuracy of 97.83% with 1 hidden layer and 98.44% with 4 hidden layers in fully connected neural networks. We then evaluate the effect of non-linearity and asymmetry in the conductance response during long-term potentiation (LTP) and long-term depression (LTD) on the performance of the on-chip training SNN system, as well as the impact of device variations.
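As a rough illustration of the kind of non-ideal conductance response evaluated in the abstract, the sketch below models non-linear, asymmetric LTP/LTD updates of a single analog synaptic weight. The exponential pulse-response model, the parameter values (G_MIN, G_MAX, N_PULSES, NU_P, NU_D), and the function names are illustrative assumptions for this sketch, not values or equations taken from the paper.

```python
import numpy as np

# Minimal sketch of a non-linear, asymmetric conductance response for an
# analog synaptic device. The exponential pulse-response model and all
# parameter values are assumptions for illustration, not from the paper.

G_MIN, G_MAX = 0.1, 1.0   # assumed conductance bounds (arbitrary units)
N_PULSES = 64             # assumed pulses needed to sweep the full range
NU_P, NU_D = 2.0, 4.0     # assumed non-linearity factors (asymmetric LTP/LTD)

def ltp_pulse(g):
    """One potentiation pulse: the increment shrinks as g approaches G_MAX."""
    b_p = (G_MAX - G_MIN) / (1.0 - np.exp(-NU_P))
    return g + (b_p - (g - G_MIN)) * (1.0 - np.exp(-NU_P / N_PULSES))

def ltd_pulse(g):
    """One depression pulse: the decrement shrinks as g approaches G_MIN."""
    b_d = (G_MAX - G_MIN) / (1.0 - np.exp(-NU_D))
    return g - (b_d - (G_MAX - g)) * (1.0 - np.exp(-NU_D / N_PULSES))

# Trace a full LTP sweep followed by a full LTD sweep.
g = G_MIN
ltp_curve = []
for _ in range(N_PULSES):
    g = ltp_pulse(g)
    ltp_curve.append(g)

ltd_curve = []
for _ in range(N_PULSES):
    g = ltd_pulse(g)
    ltd_curve.append(g)

print(f"after LTP sweep: g = {ltp_curve[-1]:.3f}; after LTD sweep: g = {ltd_curve[-1]:.3f}")
```

In a real synaptic array, a signed weight is typically encoded as the difference of two such conductances; that detail, and the device-to-device variation the paper also evaluates, are omitted from this sketch.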

Keywords