Physical Review Research (Nov 2022)

Decomposing neural networks as mappings of correlation functions

  • Kirsten Fischer,
  • Alexandre René,
  • Christian Keup,
  • Moritz Layer,
  • David Dahmen,
  • Moritz Helias

DOI
https://doi.org/10.1103/PhysRevResearch.4.043143
Journal volume & issue
Vol. 4, no. 4
p. 043143

Abstract

Understanding the functional principles of information processing in deep neural networks continues to be a challenge, in particular for networks with trained and thus nonrandom weights. To address this issue, we study the mapping between probability distributions implemented by a deep feed-forward network. We characterize this mapping as an iterated transformation of distributions, where the nonlinearity in each layer transfers information between different orders of correlation functions. This allows us to identify essential statistics in the data, as well as different information representations that can be used by neural networks. Applied to an XOR task and to MNIST, we show that correlations up to second order predominantly capture the information processing in the internal layers, while the input layer also extracts higher-order correlations from the data. This analysis provides a quantitative and explainable perspective on classification.
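To make the central idea concrete, here is a minimal, hedged sketch of what "correlation functions up to second order" of a layer's activations look like in practice. The weights, the noisy XOR encoding, and the helper name `cumulants_up_to_second_order` are illustrative assumptions, not the paper's trained parameters or code: the point is only that a nonlinearity maps the input statistics to output statistics, mixing orders along the way.

```python
import numpy as np

# Illustrative sketch (not the authors' code): track the first- and
# second-order cumulants (mean and covariance) of activations as data
# passes through one feed-forward layer with a tanh nonlinearity.

rng = np.random.default_rng(0)

# XOR-style inputs, repeated with small Gaussian noise so that the
# sample statistics are well defined (a hypothetical data encoding).
base = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
X = np.repeat(base, 250, axis=0) + 0.05 * rng.standard_normal((1000, 2))

# Random (untrained, purely illustrative) weights and biases.
W = rng.standard_normal((2, 4))
b = rng.standard_normal(4)

pre = X @ W + b        # pre-activations of the hidden layer
post = np.tanh(pre)    # post-activations after the nonlinearity

def cumulants_up_to_second_order(A):
    """First- and second-order cumulants: mean vector and covariance."""
    return A.mean(axis=0), np.cov(A, rowvar=False)

# The nonlinearity transforms these statistics; in general the output
# mean and covariance also depend on higher-order statistics of `pre`.
mu_in, cov_in = cumulants_up_to_second_order(pre)
mu_out, cov_out = cumulants_up_to_second_order(post)

print(mu_out.shape, cov_out.shape)  # (4,) (4, 4)
```

Comparing `(mu_in, cov_in)` with `(mu_out, cov_out)` layer by layer is the spirit of the decomposition the abstract describes: each layer is viewed as a map between such sets of correlation functions rather than between individual samples.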