Nature Communications (Jul 2020)
A solution to the learning dilemma for recurrent networks of spiking neurons
Abstract
Bellec et al. present a mathematically founded approximation of gradient descent for training recurrent neural networks that does not require backpropagation through time. This enables biologically plausible training of spike-based neural network models with working memory and supports on-chip training of neuromorphic hardware.
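The approximation introduced in the paper (e-prop) propagates eligibility traces forward in time and combines them with an online learning signal, so no storage of the network's full activity history is needed. The sketch below illustrates this style of update in Python for a toy leaky spiking network; the network sizes, leak factor, triangular surrogate derivative, and broadcast of the readout error through the output weights are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

# Minimal sketch of an eligibility-trace-style online update (e-prop flavour).
# All names, sizes, and model details are assumptions for illustration.

rng = np.random.default_rng(0)

n_in, n_rec, n_out = 20, 50, 2      # network sizes (assumed)
alpha = 0.9                         # membrane leak factor (assumed)
lr = 1e-3                           # learning rate (assumed)

W_in  = rng.normal(0, 0.1, (n_rec, n_in))
W_rec = rng.normal(0, 0.1, (n_rec, n_rec))
W_out = rng.normal(0, 0.1, (n_out, n_rec))

def pseudo_derivative(v):
    """Triangular surrogate derivative of the spike nonlinearity (assumed)."""
    return np.maximum(0.0, 1.0 - np.abs(v))

T = 100
x = rng.binomial(1, 0.05, (T, n_in)).astype(float)   # input spike trains
y_target = rng.normal(0, 1, (T, n_out))              # dummy regression target

v = np.zeros(n_rec)                 # membrane potentials
z = np.zeros(n_rec)                 # recurrent spikes
x_bar = np.zeros(n_in)              # low-pass filtered presynaptic activity
dW_in = np.zeros_like(W_in)         # accumulated weight update

for t in range(T):
    # Forward dynamics of a simple leaky integrator with threshold 1 (assumed model).
    v = alpha * v + W_in @ x[t] + W_rec @ z
    z = (v > 1.0).astype(float)
    v = v - z                       # soft reset after a spike

    # Eligibility trace: filtered presynaptic activity gated by the surrogate
    # derivative of the postsynaptic unit, computed forward in time.
    x_bar = alpha * x_bar + x[t]
    e_in = pseudo_derivative(v)[:, None] * x_bar[None, :]

    # Learning signal: readout error broadcast back through the output weights
    # (a random-feedback variant would use fixed random weights instead of W_out.T).
    y = W_out @ z
    L = W_out.T @ (y - y_target[t])

    # Online update: learning signal times eligibility trace, no unrolling in time.
    dW_in += L[:, None] * e_in

W_in -= lr * dW_in
```

The same pattern extends to the recurrent and output weights; the key point the sketch tries to convey is that every quantity in the update is available locally and online, which is what makes the rule compatible with biological circuits and with on-chip learning on neuromorphic hardware.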