PLoS Computational Biology (Nov 2008)

Optimal learning rules for discrete synapses.

  • Adam B Barrett,
  • M C W van Rossum

DOI
https://doi.org/10.1371/journal.pcbi.1000230
Journal volume & issue
Vol. 4, no. 11
p. e1000230

Abstract

There is evidence that biological synapses have a limited number of discrete weight states. Memory storage with such synapses behaves quite differently from storage with unbounded, continuous weights, because old memories are automatically overwritten by new ones. Consequently, there has been substantial discussion about how this affects learning and storage capacity. In this paper, we calculate the storage capacity of discrete, bounded synapses in terms of Shannon information. We use this to optimize the learning rules and investigate how the maximum information capacity depends on the number of synapses, the number of synaptic states, and the coding sparseness. Below a certain critical number of synapses per neuron (comparable to numbers found in biology), we find that storage is similar to that of unbounded, continuous synapses. Hence, discrete synapses do not necessarily have a lower storage capacity.
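
The setting described in the abstract can be illustrated with a toy Monte Carlo sketch. This is not the authors' model: it assumes the simplest case of two-state (binary) synapses updated by stochastic potentiation and depression, and all names and parameter values (store, p_pot, p_dep, sparseness, n_synapses) are illustrative. It estimates how much Shannon information a population of such synapses retains about one tracked memory as later memories overwrite it, which is the quantity the paper optimizes over learning rules.

    # Sketch: binary synapses overwritten by a stream of memories.
    # We estimate the Shannon information (bits/synapse) retained about
    # one tracked memory as a function of its age.
    import numpy as np

    rng = np.random.default_rng(0)

    def store(weights, desired, p_pot, p_dep):
        """Apply one memory: potentiate/depress each synapse stochastically."""
        flip = rng.random(weights.shape)
        pot = (desired == 1) & (flip < p_pot)   # successful potentiations
        dep = (desired == 0) & (flip < p_dep)   # successful depressions
        weights[pot] = 1
        weights[dep] = 0
        return weights

    def mutual_information(desired, weights):
        """Empirical mutual information (bits) between desired and actual states."""
        mi = 0.0
        for d in (0, 1):
            for w in (0, 1):
                p_joint = np.mean((desired == d) & (weights == w))
                p_d, p_w = np.mean(desired == d), np.mean(weights == w)
                if p_joint > 0:
                    mi += p_joint * np.log2(p_joint / (p_d * p_w))
        return mi

    n_synapses = 100_000      # population size (illustrative)
    sparseness = 0.5          # probability a memory requests potentiation
    p_pot, p_dep = 0.5, 0.5   # transition probabilities of the learning rule

    weights = rng.integers(0, 2, n_synapses)
    tracked = (rng.random(n_synapses) < sparseness).astype(int)  # memory we track
    weights = store(weights, tracked, p_pot, p_dep)

    # Information about the tracked memory decays as newer memories are stored.
    for age in range(1, 21):
        newer = (rng.random(n_synapses) < sparseness).astype(int)
        weights = store(weights, newer, p_pot, p_dep)
        if age % 5 == 0:
            print(f"after {age:2d} newer memories: "
                  f"{mutual_information(tracked, weights):.4f} bits/synapse")

In this toy version, each subsequent memory erases part of the trace, so the per-synapse information decays roughly geometrically with memory age; optimizing the learning rule then amounts to choosing the transition probabilities (and, for multi-state synapses, the transition matrices) that maximize the total information summed over all stored memories.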