Frontiers in Computational Neuroscience (May 2021)

Layer-Skipping Connections Improve the Effectiveness of Equilibrium Propagation on Layered Networks

  • Jimmy Gammell,
  • Sonia Buckley,
  • Sae Woo Nam,
  • Adam N. McCaughan

DOI
https://doi.org/10.3389/fncom.2021.627357
Journal volume & issue
Vol. 15

Abstract

Equilibrium propagation is a learning framework that marks a step forward in the search for a biologically plausible implementation of deep learning, and it could be implemented efficiently in neuromorphic hardware. Previous applications of this framework to layered networks suffered from a vanishing-gradient problem that has not yet been solved in a simple, biologically plausible way. In this paper, we demonstrate that the vanishing-gradient problem can be mitigated by replacing some of a layered network's connections with random layer-skipping connections, in a manner inspired by small-world networks. This approach would be convenient to implement in neuromorphic hardware and is biologically plausible.
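
To make the architectural idea concrete, the sketch below shows one plausible way to rewire a layered network with random layer-skipping connections, in the spirit of small-world (Watts-Strogatz-style) rewiring. This is an illustration, not the authors' published procedure: the layer sizes, the rewiring probability p_skip, and the choice to redirect edges only toward strictly later layers are all assumptions made here for demonstration.

    import random

    def build_layered_edges(layer_sizes):
        """Build an all-to-all feedforward graph between adjacent layers.
        Returns (edges, layer_of, nodes_in_layer)."""
        layer_of, next_id, nodes_in_layer = {}, 0, []
        for layer, size in enumerate(layer_sizes):
            ids = list(range(next_id, next_id + size))
            nodes_in_layer.append(ids)
            for node in ids:
                layer_of[node] = layer
            next_id += size
        edges = [(i, j)
                 for layer in range(len(layer_sizes) - 1)
                 for i in nodes_in_layer[layer]
                 for j in nodes_in_layer[layer + 1]]
        return edges, layer_of, nodes_in_layer

    def rewire_with_skips(edges, layer_of, nodes_in_layer, p_skip, rng=random):
        """With probability p_skip (an illustrative parameter, not from the
        paper), redirect an adjacent-layer edge to a random node in a
        strictly later layer, creating a layer-skipping connection."""
        n_layers = len(nodes_in_layer)
        rewired = []
        for i, j in edges:
            # Only edges whose source has at least one non-adjacent later
            # layer are eligible for rewiring.
            if layer_of[i] < n_layers - 2 and rng.random() < p_skip:
                target_layer = rng.randrange(layer_of[i] + 2, n_layers)
                j = rng.choice(nodes_in_layer[target_layer])
            rewired.append((i, j))
        return rewired

    if __name__ == "__main__":
        random.seed(0)
        edges, layer_of, nodes = build_layered_edges([16, 32, 32, 8])
        skipped = rewire_with_skips(edges, layer_of, nodes, p_skip=0.1)
        n_skips = sum(1 for i, j in skipped if layer_of[j] - layer_of[i] > 1)
        print(f"{n_skips} of {len(skipped)} connections skip at least one layer")

Under this rewiring rule, most connectivity remains layered while a small random fraction of edges provides short paths between distant layers, which is the small-world property the abstract invokes to mitigate vanishing gradients.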

Keywords