Physical Review Research (Jul 2024)

Weight fluctuations in deep linear neural networks and a derivation of the inverse-variance flatness relation

  • Markus Gross
  • Arne P. Raulf
  • Christoph Räth

DOI: https://doi.org/10.1103/PhysRevResearch.6.033103
Journal volume & issue: Vol. 6, no. 3, p. 033103

Abstract

We investigate the stationary (late-time) training regime of single- and two-layer underparameterized linear neural networks within the continuum limit of stochastic gradient descent (SGD) for synthetic Gaussian data. In the case of a single-layer network in the weakly underparameterized regime, the spectrum of the noise covariance matrix deviates notably from that of the Hessian, which can be attributed to the broken detailed balance of SGD dynamics. The weight fluctuations are, in this case, generally anisotropic, but effectively experience an isotropic loss. For an underparameterized two-layer network, we describe the stochastic dynamics of the weights in each layer and analyze the associated stationary covariances. We identify the interlayer coupling as a distinct source of anisotropy for the weight fluctuations. In contrast to the single-layer case, the weight fluctuations are effectively subject to an anisotropic loss, the flatness of which is inversely related to the fluctuation variance. We thereby provide an analytical derivation of the recently observed inverse-variance flatness relation in a model of a deep linear neural network.
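
A note on the underlying formalism (not part of the published abstract): the late-time regime described above is commonly modeled by linearizing the continuum limit of SGD around a loss minimum w*, which gives an Ornstein-Uhlenbeck process. The sketch below uses standard generic notation, where eta (learning rate), H (Hessian), D (SGD noise covariance), and Sigma (stationary weight covariance) are assumptions of this note rather than the paper's exact definitions:

    % Linearized continuum-limit SGD (Ornstein-Uhlenbeck form):
    dw_t = -H\,(w_t - w^*)\,dt + \sqrt{\eta}\,B\,dW_t,
    \qquad D \equiv B B^{\top}.
    % The stationary weight covariance \Sigma solves the Lyapunov equation:
    H \Sigma + \Sigma H^{\top} = \eta\,D.

In this formulation, detailed balance holds only if H and D commute, in which case \Sigma = (\eta/2)\,H^{-1} D and the fluctuation spectrum tracks the Hessian; when [H, D] \neq 0, the stationary state carries probability currents, which is consistent with the abstract's statement that the noise covariance spectrum deviates from the Hessian once detailed balance is broken.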