IEEE Access (Jan 2020)

Optimal Distribution of Spiking Neurons Over Multicore Neuromorphic Processors

  • Guhyun Kim,
  • Vladimir Kornijcuk,
  • Jeeson Kim,
  • Cheol Seong Hwang,
  • Doo Seok Jeong

DOI: https://doi.org/10.1109/ACCESS.2020.2986490
Journal volume & issue: Vol. 8, pp. 69426–69437

Abstract

In a multicore neuromorphic processor that embeds a learning algorithm, a presynaptic neuron is occasionally located in a different core from its postsynaptic neurons, so inference requires neuron-to-core communication through a network router. The more neuron-to-core connections there are, the heavier the workload on the network router and the more likely event-routing congestion becomes. A further challenge arising from a large number of neuron-to-core connections is data duplication across multiple cores, required so that the learning algorithm can access the full data needed to evaluate weight updates. This duplication consumes a considerable amount of on-chip memory, while the memory capacity per core is strictly limited. The optimal distribution of neurons over cores is a constrained optimization problem, which the discrete Lagrangian multiplier method (LMM) can address. Proof-of-concept demonstrations were made on the distribution of neurons over cores in a neuromorphic processor embedding a learning algorithm. Two learning algorithms were considered: a simple spike timing-dependent plasticity rule and the event-driven random backpropagation algorithm, categorized as a two-factor and a three-factor learning rule, respectively. The discrete LMM reduced the number of neuron-to-core connections for both algorithms by approximately 55% compared with random distribution, implying a 55% reduction in the workload on the network router and a 52.8% reduction in data duplication. The code is available online (https://github.com/guhyunkim/Optimize-neuron-distribution).
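To make the cost the abstract optimizes concrete, the sketch below counts distinct (presynaptic neuron, target core) pairs — the neuron-to-core connections that load the network router — for a random distribution, then reduces that count with a simple capacity-preserving swap search. This is an illustrative stand-in, not the authors' discrete LMM (their implementation is in the linked repository); the network size, core count, and synthetic connectivity are assumptions for demonstration only.

```python
import random

random.seed(0)
N_NEURONS, N_CORES, CAPACITY = 64, 4, 16  # toy sizes, chosen for illustration

# Synthetic connectivity: each presynaptic neuron projects to 5 random targets.
synapses = {n: random.sample(range(N_NEURONS), 5) for n in range(N_NEURONS)}

def neuron_to_core_connections(assignment):
    """Count distinct (presynaptic neuron, target core) pairs that cross cores."""
    count = 0
    for pre, posts in synapses.items():
        target_cores = {assignment[p] for p in posts}
        target_cores.discard(assignment[pre])  # same-core spikes need no routing
        count += len(target_cores)
    return count

# Baseline: random distribution that respects the per-core capacity constraint.
order = list(range(N_NEURONS))
random.shuffle(order)
random_assign = {n: i // CAPACITY for i, n in enumerate(order)}

# Capacity-preserving local search: swap neuron pairs across cores and keep
# the swap only if the connection count does not increase (a crude stand-in
# for the constrained optimization the paper solves with the discrete LMM).
best = dict(random_assign)
for _ in range(2000):
    a, b = random.sample(range(N_NEURONS), 2)
    if best[a] == best[b]:
        continue
    cost_before = neuron_to_core_connections(best)
    best[a], best[b] = best[b], best[a]
    if neuron_to_core_connections(best) > cost_before:
        best[a], best[b] = best[b], best[a]  # revert a worsening swap

baseline = neuron_to_core_connections(random_assign)
optimized = neuron_to_core_connections(best)
print(f"random: {baseline} connections, optimized: {optimized} connections")
```

Because swaps exchange neurons between two cores, every core keeps exactly CAPACITY neurons throughout, so the memory constraint is never violated while the router workload proxy is driven down.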

Keywords