Entropy (Jan 2010)

Maximum Entropy Approaches to Living Neural Networks

  • John M. Beggs,
  • Alan Litke,
  • Alexander Sher,
  • Wladyslaw Dabrowski,
  • Pawel Hottowy,
  • Jon P. Hobbs,
  • Fang-Chin Yeh,
  • Aonan Tang

DOI: https://doi.org/10.3390/e12010089
Journal volume & issue: Vol. 12, No. 1, pp. 89–106

Abstract

Understanding how ensembles of neurons collectively interact will be a key step in developing a mechanistic theory of cognitive processes. Recent progress in multineuron recording and analysis techniques has generated tremendous excitement over the physiology of living neural networks. One of the key developments driving this interest is a new class of models based on the principle of maximum entropy. Maximum entropy models have been reported to account for the spatial correlation structure in several different types of neuronal ensemble data. Importantly, these models require only information about the firing rates of individual neurons and their pairwise correlations. If this approach is generally applicable, it would drastically simplify the problem of understanding how neural networks behave. Given the interest in this method, several groups have now worked to extend maximum entropy models to account for temporal correlations. Here, we review how maximum entropy models have been applied to neuronal ensemble data to account for spatial and temporal correlations. We also discuss criticisms of the maximum entropy approach, which argue that it is not generally applicable to larger ensembles of neurons. We conclude that future maximum entropy models will need to address three issues: temporal correlations, higher-order correlations, and larger ensemble sizes. Finally, we provide a brief list of topics for future research.
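As a brief sketch of the model class the abstract refers to: in the standard pairwise maximum entropy (Ising-type) formulation commonly used in this literature (the notation below is generic and not taken from the paper itself), the probability of a binary activity pattern x = (x_1, ..., x_N), with x_i = 1 if neuron i spikes in a given time bin and 0 otherwise, is

P(\mathbf{x}) = \frac{1}{Z} \exp\!\Big( \sum_i h_i x_i + \sum_{i<j} J_{ij} x_i x_j \Big),

where the fields h_i and pairwise couplings J_{ij} are fitted so that the model reproduces the measured firing rates \langle x_i \rangle and pairwise correlations \langle x_i x_j \rangle, and Z normalizes the distribution. This is the least-structured (maximum entropy) distribution consistent with those two sets of constraints, which is why only single-neuron firing rates and pairwise correlations are needed to specify it.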

Keywords