Physical Review Research (Apr 2021)

Learning curves for overparametrized deep neural networks: A field theory perspective

  • Omry Cohen,
  • Or Malka,
  • Zohar Ringel

DOI
https://doi.org/10.1103/PhysRevResearch.3.023034
Journal volume & issue
Vol. 3, no. 2
p. 023034

Abstract

In the past decade, deep neural networks (DNNs) came to the fore as the leading machine-learning algorithms for a variety of tasks. Their rise was founded on market needs and engineering craftsmanship, the latter based more on trial and error than on theory. While still far behind the application forefront, the theoretical study of DNNs has recently made important advancements in analyzing the highly overparametrized regime, where some exact results have been obtained. Leveraging these ideas and adopting a more physics-like approach, here we construct a versatile field theory formalism for supervised deep learning, involving the renormalization group, Feynman diagrams, and replicas. In particular, we show that our approach leads to highly accurate predictions of learning curves of truly deep DNNs trained on polynomial regression problems. It also explains in a concrete manner why DNNs generalize well despite being highly overparametrized: they have an entropic bias toward simple functions, which, for the case of fully connected DNNs with data sampled on the hypersphere, are low-order polynomials in the input vector. Since a DNN is a complex interacting system of artificial neurons, we believe that such tools and methodologies, borrowed from condensed matter physics, would prove essential for obtaining an accurate quantitative understanding of deep learning.
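The setting the abstract describes (very wide, fully connected networks performing polynomial regression on data sampled from the hypersphere) can be probed numerically through the standard equivalence between infinitely wide networks and Gaussian process (kernel) regression. The sketch below is our own illustration, not the authors' code: it measures an empirical learning curve (test error versus number of training points) for GP regression with the closed-form erf-network NNGP kernel of Williams (1998), here simplified to have no bias term. The weight variance sigma_w2, the low-order polynomial target, the input dimension, and the sample sizes are all arbitrary choices made for this example.

import numpy as np

rng = np.random.default_rng(0)


def sample_sphere(n, d):
    """Draw n points uniformly from the unit hypersphere S^(d-1)."""
    g = rng.standard_normal((n, d))
    return g / np.linalg.norm(g, axis=1, keepdims=True)


def erf_nngp_kernel(X1, X2, sigma_w2=2.0):
    """NNGP kernel of an infinitely wide one-hidden-layer erf network
    (Williams, 1998), without a bias term:
    K(x, x') = (2/pi) * arcsin(2 s / sqrt((1 + 2 s11)(1 + 2 s22))),
    where s = sigma_w2 * x.x' and s11, s22 are the diagonal terms."""
    s12 = sigma_w2 * (X1 @ X2.T)
    s11 = sigma_w2 * np.sum(X1**2, axis=1)
    s22 = sigma_w2 * np.sum(X2**2, axis=1)
    denom = np.sqrt(np.outer(1.0 + 2.0 * s11, 1.0 + 2.0 * s22))
    return (2.0 / np.pi) * np.arcsin(np.clip(2.0 * s12 / denom, -1.0, 1.0))


def target(X):
    """Hypothetical low-order polynomial target on the sphere: a linear
    term plus a traceless quadratic (off-diagonal) term."""
    d = X.shape[1]
    return X[:, 0] + X[:, 0] * X[:, 1] * np.sqrt(d)


d, n_test, ridge = 10, 2000, 1e-6
X_test = sample_sphere(n_test, d)
y_test = target(X_test)

# Empirical learning curve: average test MSE of GP (kernel ridge)
# regression as the number of training points n grows.
for n in [10, 20, 40, 80, 160, 320, 640, 1280]:
    mses = []
    for _ in range(10):  # average over independent training-set draws
        X = sample_sphere(n, d)
        y = target(X)
        K = erf_nngp_kernel(X, X) + ridge * np.eye(n)
        k_star = erf_nngp_kernel(X_test, X)
        y_pred = k_star @ np.linalg.solve(K, y)
        mses.append(np.mean((y_pred - y_test) ** 2))
    print(f"n = {n:5d}   test MSE = {np.mean(mses):.4e}")

Under choices like these, the test error for a low-degree polynomial target typically starts to fall once n becomes comparable to the number of spherical harmonics of that degree or lower, consistent with the picture of an entropic bias toward low-order polynomials sketched in the abstract.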