Nature Communications (Nov 2021)

Correspondence between neuroevolution and gradient descent

  • Stephen Whitelam,
  • Viktor Selin,
  • Sang-Won Park,
  • Isaac Tamblyn

DOI: https://doi.org/10.1038/s41467-021-26568-2
Journal volume & issue: Vol. 12, no. 1, pp. 1–10

Abstract


Gradient-based and non-gradient-based methods for training neural networks are usually considered to be fundamentally different. The authors derive, and illustrate numerically, an analytic equivalence between the dynamics of neural-network training under conditioned stochastic mutations and under gradient descent.
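
The conditioned-mutation scheme mentioned in the abstract can be sketched numerically. The Python comparison below is an illustrative sketch, not the authors' code: the toy quadratic loss, the mutation scale `sigma`, the learning rate `eta`, and the number of averaged runs are assumptions chosen here for demonstration. It trains the same parameters once with Gaussian mutations kept only when the loss does not increase, and once with plain gradient descent, so the averaged stochastic trajectory can be compared with the deterministic one.

```python
# Illustrative sketch (not the authors' code): compare the average dynamics of
# neuroevolution with conditioned Gaussian mutations against plain gradient
# descent on a toy quadratic loss. The values of sigma and eta are arbitrary
# illustrative choices, not values taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def loss(w):
    # Toy quadratic "training loss" standing in for a neural-network objective.
    return 0.5 * np.sum(w ** 2)

def grad(w):
    # Gradient of the toy loss above.
    return w

def neuroevolution(w0, sigma=0.05, steps=2000):
    # Conditioned stochastic mutations: perturb all weights with Gaussian noise
    # and keep the mutation only if the loss does not increase.
    w = w0.copy()
    history = [loss(w)]
    for _ in range(steps):
        trial = w + sigma * rng.standard_normal(w.shape)
        if loss(trial) <= loss(w):
            w = trial
        history.append(loss(w))
    return np.array(history)

def gradient_descent(w0, eta=1e-3, steps=2000):
    # Plain gradient descent on the same loss, for comparison.
    w = w0.copy()
    history = [loss(w)]
    for _ in range(steps):
        w = w - eta * grad(w)
        history.append(loss(w))
    return np.array(history)

w0 = rng.standard_normal(10)

# Average the stochastic (neuroevolution) loss curve over independent runs so
# its mean dynamics can be set against the deterministic gradient-descent curve.
ne_curve = np.mean([neuroevolution(w0) for _ in range(50)], axis=0)
gd_curve = gradient_descent(w0)

print("final loss, neuroevolution (mean of 50 runs):", ne_curve[-1])
print("final loss, gradient descent:", gd_curve[-1])
```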