Network Neuroscience (May 2019)

Optimal modularity and memory capacity of neural reservoirs

  • Nathaniel Rodriguez,
  • Eduardo Izquierdo,
  • Yong-Yeol Ahn

DOI
https://doi.org/10.1162/netn_a_00082
Journal volume & issue
Vol. 3, no. 2
pp. 551–566

Abstract

The neural network is a powerful computing framework that has been exploited by biological evolution and by humans for solving diverse problems. Although the computational capabilities of neural networks are determined by their structure, the current understanding of the relationships between a neural network’s architecture and function is still primitive. Here we reveal that a neural network’s modular architecture plays a vital role in determining the neural dynamics and memory performance of a network of threshold neurons. In particular, we demonstrate that there exists an optimal modularity for memory performance, where a balance between local cohesion and global connectivity is established, allowing optimally modular networks to remember longer. Our results suggest that insights from dynamical analysis of neural networks and information-spreading processes can be leveraged to better design neural networks and may shed light on the brain’s modular organization.

Understanding the inner workings of the human brain is one of the greatest scientific challenges. It will not only advance the science of the human mind, but also help us build more intelligent machines. In doing so, it is crucial to understand how the structural organization of the brain affects functional capabilities. Here we reveal a strong connection between the modularity of a neural network and its performance in memory tasks. Namely, we demonstrate that there is an optimal modularity for memory performance. Our results suggest a design principle for artificial recurrent neural networks as well as a hypothesis that may explain not only the existence but also the strength of modularity in the brain.
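The setup described in the abstract can be made concrete with a small sketch. The code below is a minimal illustration, not the authors' implementation: it builds a reservoir of threshold (sign) neurons with a block-structured, modular connectivity matrix, drives it with a random input signal, and estimates memory capacity as the sum of squared correlations between delayed inputs and their linear readouts. All parameters (module count, within/between-module wiring probabilities, activation choice, lag range) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Build a modular reservoir of threshold neurons (illustrative parameters) ---
n_modules, module_size = 4, 50          # 200 neurons split into 4 communities
N = n_modules * module_size
p_in, p_out = 0.2, 0.02                 # within- vs. between-module wiring probability
membership = np.repeat(np.arange(n_modules), module_size)
same = membership[:, None] == membership[None, :]
prob = np.where(same, p_in, p_out)
W = (rng.random((N, N)) < prob) * rng.normal(0.0, 1.0, (N, N))
np.fill_diagonal(W, 0.0)

# --- Drive the reservoir with a random input signal ---
T = 2000
u = rng.uniform(-1, 1, T)
w_in = rng.normal(0.0, 1.0, N)
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    # Threshold (sign) activation stands in for the paper's threshold neurons
    x = np.sign(W @ x + w_in * u[t])
    states[t] = x

# --- Memory capacity: how well past inputs can be linearly read out of the state ---
washout, max_lag = 200, 30
mc = 0.0
for k in range(1, max_lag + 1):
    X = states[washout:]
    y = u[washout - k : T - k]          # input delayed by k steps
    w_out, *_ = np.linalg.lstsq(X, y, rcond=None)
    y_hat = X @ w_out
    r = np.corrcoef(y_hat, y)[0, 1]
    mc += r ** 2                        # memory capacity = sum of squared correlations
print(f"approximate memory capacity: {mc:.2f}")
```

Sweeping the ratio of p_in to p_out (i.e., the modularity of W) while re-measuring the memory capacity is the kind of experiment to which the paper's claim of an optimal modularity refers.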

Keywords