Frontiers in Neuroscience (Aug 2022)

A high throughput generative vector autoregression model for stochastic synapses

  • Tyler Hennen,
  • Alexander Elias,
  • Jean-François Nodin,
  • Gabriel Molas,
  • Rainer Waser,
  • Dirk J. Wouters,
  • Daniel Bedau

DOI
https://doi.org/10.3389/fnins.2022.941753
Journal volume & issue
Vol. 16

Abstract

By imitating the synaptic connectivity and plasticity of the brain, emerging electronic nanodevices offer new opportunities as the building blocks of neuromorphic systems. One challenge for large-scale simulations of computational architectures based on emerging devices is to accurately capture device response, hysteresis, noise, and the covariance structure in the temporal domain as well as between the different device parameters. We address this challenge with a high throughput generative model for synaptic arrays that is based on a recently available type of electrical measurement data for resistive memory cells. We map this real-world data onto a vector autoregressive stochastic process to accurately reproduce the device parameters and their cross-correlation structure. While closely matching the measured data, our model is still very fast; we provide parallelized implementations for both CPUs and GPUs and demonstrate array sizes above one billion cells and throughputs exceeding one hundred million weight updates per second, above the pixel rate of a 30 frames/s 4K video stream.
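The core idea of mapping measured device behavior onto a vector autoregressive (VAR) process can be illustrated with a minimal sketch. The snippet below is not the authors' implementation; the coefficient matrix, noise covariance, parameter names, and array sizes are illustrative assumptions, whereas in the paper these quantities are fit to measured resistive memory data. It shows how a VAR(1) update can generate temporally and mutually correlated device parameters for many cells in parallel with vectorized operations.

```python
import numpy as np

# Minimal VAR(1) sketch: x_t = A x_{t-1} + L eps_t.
# A captures correlation between successive switching cycles;
# L (Cholesky factor of the noise covariance) captures
# cross-correlation between the device parameters.
rng = np.random.default_rng(0)

n_params = 3          # e.g., HRS, LRS, switching voltage (illustrative choice)
n_cells = 1_000_000   # number of synaptic cells simulated in parallel
n_steps = 100         # number of weight updates (switching cycles)

# Illustrative coefficients; in practice these are fit to measurement data.
A = 0.9 * np.eye(n_params)
noise_cov = np.array([[1.0, 0.5, 0.2],
                      [0.5, 1.0, 0.3],
                      [0.2, 0.3, 1.0]])
L = np.linalg.cholesky(noise_cov)

x = np.zeros((n_cells, n_params))   # state of every cell in the array
for _ in range(n_steps):
    eps = rng.standard_normal((n_cells, n_params))
    x = x @ A.T + eps @ L.T         # one vectorized update over all cells
```

Because each step is a dense matrix operation over the whole array, the same structure maps directly onto GPU array libraries, which is consistent with the CPU and GPU implementations and the throughput figures reported in the abstract.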

Keywords