Biomimetics (Aug 2023)

IDSNN: Towards High-Performance and Low-Latency SNN Training via Initialization and Distillation

  • Xiongfei Fan,
  • Hong Zhang,
  • Yu Zhang

DOI
https://doi.org/10.3390/biomimetics8040375
Journal volume & issue
Vol. 8, no. 4
p. 375

Abstract


Spiking neural networks (SNNs) are widely recognized for their biomimetic and efficient computing features; they use spikes to encode and transmit information. Despite these advantages, SNNs suffer from low accuracy when trained directly and from large inference latency when converted from trained artificial neural networks (ANNs). To address both limitations, we propose a novel training pipeline, IDSNN, based on parameter initialization and knowledge distillation, with an ANN serving as both the parameter source and the teacher. IDSNN maximizes the knowledge extracted from ANNs and achieves competitive top-1 accuracy on CIFAR10 (94.22%) and CIFAR100 (75.41%) with low latency. More importantly, it converges 14× faster than directly trained SNNs under limited training resources, which demonstrates its practical value in applications.
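The knowledge-distillation component mentioned in the abstract can be illustrated with a minimal sketch of the standard distillation objective, in which an ANN teacher's temperature-softened logits guide the student. This is a generic, hypothetical illustration (names `distillation_loss`, `T`, and `alpha` are assumptions, not the paper's exact formulation):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Generic distillation objective (illustrative, not IDSNN's exact loss):
    a blend of hard-label cross-entropy and a soft-target KL term, where the
    ANN teacher's softened outputs supervise the (here, surrogate) student."""
    p_s = softmax(student_logits, T)
    p_t = softmax(teacher_logits, T)
    # KL(teacher || student) on softened distributions, scaled by T^2
    # as in standard distillation to keep gradient magnitudes comparable.
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    # Hard-label cross-entropy at temperature 1.
    p1 = softmax(student_logits, 1.0)
    ce = -np.log(p1[np.arange(len(labels)), labels] + 1e-12)
    return np.mean(alpha * ce + (1 - alpha) * (T ** 2) * kl)
```

When the student matches the teacher exactly, the KL term vanishes and only the hard-label term remains; a mismatched teacher raises the loss, pulling the student toward the teacher's output distribution.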

Keywords