IEEE Access (Jan 2023)

Recurrent Residual Networks Contain Stronger Lottery Tickets

  • Angel Lopez Garcia-Arias,
  • Yasuyuki Okoshi,
  • Masanori Hashimoto,
  • Masato Motomura,
  • Jaehoon Yu

DOI
https://doi.org/10.1109/ACCESS.2023.3245808
Journal volume & issue
Vol. 11
pp. 16588–16604

Abstract


Accurate neural networks can be found just by pruning a randomly initialized overparameterized model, removing the need for any weight optimization. The resulting subnetworks are small, sparse, and ternary, making them excellent candidates for efficient hardware implementation. However, finding optimal connectivity patterns remains an open challenge. Based on evidence that residual networks may be approximating unrolled shallow recurrent neural networks, we conjecture that they contain better candidate subnetworks at inference time when explicitly transformed into recurrent architectures. We test this hypothesis on image classification tasks, where we find subnetworks within the recurrent models that are more accurate and parameter-efficient than both the ones found within feedforward models and the full models with learned weights. Furthermore, random recurrent subnetworks are tiny: under a simple compression scheme, ResNet-50 is compressed to a 48.55× smaller memory footprint without a drastic loss in performance, fitting in under 2 megabytes. Code available at: https://github.com/Lopez-Angel/hidden-fold-networks.
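The abstract combines two ideas: folding a residual stage into a recurrent block that reuses one set of random weights, and selecting a subnetwork by pruning scores rather than training weights. The sketch below is a minimal illustration of those ideas, not the authors' implementation; it assumes PyTorch and edge-popup-style score pruning, and the names GetSubnet, MaskedConv2d, FoldedStage, and the sparsity parameter are illustrative.

import torch
import torch.nn as nn
import torch.nn.functional as F


class GetSubnet(torch.autograd.Function):
    """Keep the top-k scored weights; straight-through gradient for the scores."""

    @staticmethod
    def forward(ctx, scores, sparsity):
        k = int((1.0 - sparsity) * scores.numel())
        mask = torch.zeros_like(scores)
        _, idx = scores.abs().flatten().topk(k)
        mask.view(-1)[idx] = 1.0
        return mask

    @staticmethod
    def backward(ctx, grad):
        return grad, None  # straight-through estimator


class MaskedConv2d(nn.Conv2d):
    """Convolution with frozen random weights; only the pruning scores are learned."""

    def __init__(self, *args, sparsity=0.5, **kwargs):
        kwargs.setdefault("bias", False)
        super().__init__(*args, **kwargs)
        self.weight.requires_grad = False                  # weights stay random
        self.scores = nn.Parameter(0.01 * torch.randn_like(self.weight))
        self.sparsity = sparsity

    def forward(self, x):
        mask = GetSubnet.apply(self.scores, self.sparsity)
        return F.conv2d(x, self.weight * mask, None, self.stride,
                        self.padding, self.dilation, self.groups)

class FoldedStage(nn.Module):
    """One residual block applied recurrently: an unrolled-RNN view of a ResNet stage."""

    def __init__(self, channels, iterations=4, sparsity=0.5):
        super().__init__()
        self.conv = MaskedConv2d(channels, channels, 3, padding=1,
                                 sparsity=sparsity)
        # Per-iteration BatchNorm, as is common for weight-shared unrolling.
        self.bns = nn.ModuleList(nn.BatchNorm2d(channels)
                                 for _ in range(iterations))

    def forward(self, x):
        for bn in self.bns:             # shared random weights, reused each step
            x = x + F.relu(bn(self.conv(x)))
        return x


if __name__ == "__main__":
    stage = FoldedStage(channels=16, iterations=4, sparsity=0.7)
    y = stage(torch.randn(2, 16, 8, 8))
    print(y.shape)  # torch.Size([2, 16, 8, 8])

In a training loop, only the scores (and BatchNorm statistics) would receive gradient updates; the surviving mask plus the shared random weights then constitute the sparse, heavily weight-shared subnetwork that the compression figures in the abstract refer to.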

Keywords