PLoS Computational Biology (Jun 2016)

Learning Universal Computations with Spikes.

  • Dominik Thalmeier,
  • Marvin Uhlmann,
  • Hilbert J Kappen,
  • Raoul-Martin Memmesheimer

DOI
https://doi.org/10.1371/journal.pcbi.1004895
Journal volume & issue
Vol. 12, no. 6
p. e1004895

Abstract

Providing the neurobiological basis of information processing in higher animals, spiking neural networks must be able to learn a variety of complicated computations, including the generation of appropriate, possibly delayed reactions to inputs and the self-sustained generation of complex activity patterns, e.g., for locomotion. Many such computations require the prior building of intrinsic world models. Here we show how spiking neural networks may solve these different tasks. First, we derive constraints under which classes of spiking neural networks lend themselves as substrates of powerful general-purpose computing. The networks contain dendritic or synaptic nonlinearities and have a constrained connectivity. We then combine such networks with learning rules for outputs or recurrent connections. We show that this allows the learning of even difficult benchmark tasks, such as the self-sustained generation of desired low-dimensional chaotic dynamics or memory-dependent computations. Furthermore, we show how spiking networks can build models of external world systems and use the acquired knowledge to control them.
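
The abstract mentions combining spiking networks with learning rules for outputs. As a rough, self-contained illustration of that general idea (not the paper's model or learning rule), the Python sketch below simulates a randomly connected leaky integrate-and-fire network and fits a linear readout on filtered spike trains by ridge regression to reproduce a delayed sinusoid. All names and parameters (N, tau_m, tau_s, lam, the 100 ms delay, etc.) are illustrative assumptions, not values from the article.

# Illustrative sketch only: random LIF reservoir + linear readout fitted
# offline by ridge regression. Parameters are assumptions for demonstration.
import numpy as np

rng = np.random.default_rng(0)

N = 200            # number of LIF neurons (assumed)
dt = 1e-3          # simulation step (s)
T = 5.0            # total simulated time (s)
tau_m = 20e-3      # membrane time constant
tau_s = 30e-3      # synaptic / readout filter time constant
v_thresh, v_reset = 1.0, 0.0

steps = int(T / dt)
W = rng.normal(0.0, 1.5 / np.sqrt(N), (N, N))   # random recurrent weights
w_in = rng.normal(0.0, 1.0, N)                  # random input weights

t = np.arange(steps) * dt
u = np.sin(2 * np.pi * 1.0 * t)                 # 1 Hz input signal
y_target = np.sin(2 * np.pi * 1.0 * (t - 0.1))  # target: input delayed by 100 ms

v = np.zeros(N)            # membrane potentials
r = np.zeros(N)            # exponentially filtered spike trains (readout basis)
R = np.zeros((steps, N))   # recorded filtered traces for batch regression

for k in range(steps):
    spikes = (v >= v_thresh).astype(float)
    v = np.where(spikes > 0, v_reset, v)                 # reset after a spike
    v += dt / tau_m * (-v + W @ r + w_in * u[k])         # leaky integration
    r += dt / tau_s * (-r) + spikes                      # filter the spikes
    R[k] = r

# Fit the linear readout weights by ridge regression (one assumed rule choice).
lam = 1e-2
w_out = np.linalg.solve(R.T @ R + lam * np.eye(N), R.T @ y_target)
y_hat = R @ w_out

mse = np.mean((y_hat[steps // 2:] - y_target[steps // 2:]) ** 2)
print(f"readout MSE on second half of the run: {mse:.4f}")

Offline ridge regression is used here purely to keep the sketch short and reproducible; online rules for readouts or recurrent weights, as discussed in the article, would require a different treatment.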