Scientific Reports (Jun 2021)

FNS allows efficient event-driven spiking neural network simulations based on a neuron model supporting spike latency

  • Gianluca Susi,
  • Pilar Garcés,
  • Emanuele Paracone,
  • Alessandro Cristini,
  • Mario Salerno,
  • Fernando Maestú,
  • Ernesto Pereda

DOI
https://doi.org/10.1038/s41598-021-91513-8
Journal volume & issue
Vol. 11, no. 1
pp. 1–17

Abstract

Neural modelling tools are increasingly employed to describe, explain, and predict the human brain’s behavior. Among them, spiking neural networks (SNNs) make it possible to simulate neural activity at the level of single neurons, but their use is often limited by the processing power and memory they require. Emerging applications that demand a low energy budget (e.g. implanted neuroprostheses) motivate the search for new strategies able to capture the relevant principles of neuronal dynamics in reduced, efficient models. The recent Leaky Integrate-and-Fire with Latency (LIFL) spiking neuron model combines realistic neuronal features with computational efficiency, a pairing that makes it appealing for SNN-based brain modelling. In this paper we introduce FNS, the first LIFL-based SNN framework, which combines spiking/synaptic modelling with the event-driven approach and allows the definition of heterogeneous neuron groups and multi-scale connectivity, with delayed connections and plastic synapses. FNS enables precise multi-threaded simulations by integrating a novel parallelization strategy with a periodic dumping mechanism. We evaluate the performance of FNS in terms of simulation time and memory usage, and compare it with that of neuron models with a similar neurocomputational profile implemented in NEST, showing that FNS performs better on both measures. FNS can be used advantageously to explore the interaction within and between populations of spiking neurons, even over long time scales and on limited hardware configurations.
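The two ingredients the abstract highlights, the event-driven approach and the LIFL neuron's spike latency, can be illustrated with a minimal sketch. The Python snippet below (FNS itself is distributed as a Java framework; Python is used here only for brevity) shows a toy event-driven loop: neuron state is updated lazily, only when a spike event arrives, and a suprathreshold neuron emits its spike after a state-dependent latency rather than instantaneously. All names, parameter values, and the specific latency curve are illustrative assumptions, not FNS's actual API or the paper's exact equations.

```python
import heapq

# Illustrative constants; not FNS's actual parameter names or values.
THRESHOLD = 1.0      # firing threshold
DECAY = 0.1          # linear leak per unit time
LATENCY_GAIN = 1.0   # scales the state-dependent spike latency

def spike_latency(state):
    """Latency shrinks as the suprathreshold state grows.
    An illustrative 1/(state - threshold) shape, not the paper's exact curve."""
    return LATENCY_GAIN / (state - THRESHOLD)

class Neuron:
    def __init__(self):
        self.state = 0.0
        self.last_update = 0.0

    def receive(self, t, weight):
        # Event-driven core: the leak is applied lazily, only when an event
        # actually reaches this neuron, instead of at every clock tick.
        elapsed = t - self.last_update
        self.state = max(0.0, self.state - DECAY * elapsed)
        self.last_update = t
        self.state += weight
        if self.state > THRESHOLD:
            # LIFL: the spike is emitted after a state-dependent latency
            # rather than instantaneously. (The full LIFL model lets later
            # input modulate the pending firing time; this sketch omits that.)
            fire_time = t + spike_latency(self.state)
            self.state = 0.0
            return fire_time
        return None

def simulate(neurons, synapses, initial_events, t_end):
    """synapses[i] -> list of (target_index, weight, axonal_delay)."""
    queue = list(initial_events)          # entries: (time, target, weight)
    heapq.heapify(queue)
    fires = []
    while queue:
        t, idx, w = heapq.heappop(queue)
        if t > t_end:
            break
        fire_time = neurons[idx].receive(t, w)
        if fire_time is not None:
            fires.append((idx, fire_time))
            for tgt, wgt, delay in synapses.get(idx, []):
                heapq.heappush(queue, (fire_time + delay, tgt, wgt))
    return fires

if __name__ == "__main__":
    neurons = [Neuron() for _ in range(3)]
    synapses = {0: [(1, 0.8, 0.5)], 1: [(2, 1.2, 0.5)]}
    # Two pulses into neuron 0 push it over threshold; note the latency
    # between the second input (t = 0.1) and the emitted spike time.
    print(simulate(neurons, synapses, [(0.0, 0, 0.7), (0.1, 0, 0.7)], t_end=10.0))
```

The lazy leak update inside receive() is what distinguishes an event-driven scheme from clock-driven simulation: computation is spent only where and when spikes occur, which is the source of the efficiency the abstract refers to.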