Physical Review Research (Feb 2021)

Noise optimizes super-Turing computation in recurrent neural networks

  • Emmett Redd,
  • Steven Senger,
  • Tayo Obafemi-Ajayi

DOI: https://doi.org/10.1103/PhysRevResearch.3.013120
Journal volume & issue: Vol. 3, no. 1, p. 013120

Abstract


This paper explores the benefit of added noise in increasing the computational complexity of digital recurrent neural networks (RNNs). The physically accepted model of the universe imposes rational-number, stochastic limits on all calculations. An analog RNN subject to those limits computes at the super-Turing complexity level BPP/log*. In this paper, we demonstrate how noise helps digital RNNs attain super-Turing operation similar to that of analog RNNs. We investigate how increasing noise carries limited-precision systems from non-chaotic behavior at small noise amplitudes, through consistency with chaos, to a regime where large noise amplitudes overwhelm the dynamics. A Kolmogorov-complexity-based proof shows that an infinite hierarchy of computational classes exists between P, the Turing class, and BPP/log*. This hierarchy raises the possibility that noise-enhanced digital RNNs operate at a super-Turing level less complex than BPP/log*. As the uniform noise increases, the digital RNNs develop positive Lyapunov exponents, indicating that chaos is mimicked. The exponents peak at the accepted values for the logistic and Hénon maps when the noise equals eight times the least significant bit of the noisy recurrent signals for the logistic digital RNN and four times for the Hénon digital RNN.
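The final claim can be illustrated numerically. The sketch below is not the authors' code: it iterates a logistic map quantized to a hypothetical 8-bit fixed-point grid, injects zero-mean uniform noise whose total width is a chosen multiple of the least significant bit (LSB), and estimates the Lyapunov exponent from the mean log-derivative of the smooth map along the noisy orbit. The precision, step count, seed, and noise multiples are illustrative assumptions, not values from the paper.

```python
import numpy as np

def quantize(x, bits):
    """Round x onto a fixed-point grid of 2**bits levels spanning [0, 1)."""
    levels = 2 ** bits
    return np.round(x * levels) / levels

def lyapunov_estimate(noise_lsb, bits=8, r=4.0, n_steps=200_000, seed=0):
    """Estimate the Lyapunov exponent of a limited-precision logistic map
    driven by zero-mean uniform noise of total width noise_lsb * LSB."""
    rng = np.random.default_rng(seed)
    lsb = 2.0 ** -bits
    x = quantize(0.3, bits)
    acc = 0.0
    for _ in range(n_steps):
        # Local stretching rate of the underlying smooth map, ln|f'(x)|;
        # the small floor guards against x landing exactly on 0.5.
        acc += np.log(max(abs(r * (1.0 - 2.0 * x)), 1e-12))
        x = r * x * (1.0 - x)                          # logistic update
        x += noise_lsb * lsb * (rng.random() - 0.5)    # uniform noise
        # Clip below 1.0 so the quantized orbit cannot be absorbed at the
        # fixed point 0 via 1.0; without noise it still collapses to a
        # short periodic cycle, as finite-precision chaotic maps do.
        x = quantize(min(max(x, 0.0), 1.0 - lsb), bits)
    return acc / n_steps

if __name__ == "__main__":
    # The accepted exponent for the r = 4 logistic map is ln 2 ~ 0.693.
    for k in (0, 1, 8, 64):
        print(f"noise = {k:2d} x LSB: lambda ~ {lyapunov_estimate(k):+.3f}")
```

Under these assumptions, the estimate should sit near ln 2 for moderate noise and degrade once the noise amplitude dominates the signal, mirroring the small-noise/consistent/overwhelming progression described in the abstract.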