Symmetry (Apr 2020)

Analysis of Recurrent Neural Network and Predictions

  • Jieun Park,
  • Dokkyun Yi,
  • Sangmin Ji

DOI
https://doi.org/10.3390/sym12040615
Journal volume & issue
Vol. 12, no. 4
p. 615

Abstract

This paper analyzes the operating principle and predicted values of the recurrent neural network (RNN), the most basic neural-network structure and the one best suited to temporal change, for various types of artificial intelligence (AI). In particular, an RNN in which all connections are symmetric is guaranteed to converge. The operating principle of an RNN is a linear combination of data composed with nonlinear activation functions. The linearly combined data resemble the autoregressive moving-average (ARMA) method of statistical processing; however, the distortion introduced by the nonlinear activation function causes the RNN's predicted value to differ from the ARMA prediction. From this analysis, we obtain the limit of an RNN's predicted value and the range over which the prediction varies with the training data. Numerical experiments, in addition to mathematical proofs, confirm our claims.
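The contrast drawn in the abstract can be sketched in a few lines. The snippet below is an illustration only, not the paper's experiment: a scalar RNN update h_t = tanh(w·h_{t-1} + u·x_t) is compared with the analogous linear ARMA-style recurrence h_t = w·h_{t-1} + u·x_t (the weights w and u are arbitrary choices). The tanh saturates, so the RNN's state, and hence its prediction, stays bounded in (-1, 1) regardless of input scale, while the linear recurrence does not.

```python
import numpy as np

def rnn_step(h, x, w=0.9, u=1.0):
    # Nonlinear RNN update: linear combination passed through tanh.
    return np.tanh(w * h + u * x)

def arma_step(h, x, w=0.9, u=1.0):
    # Linear ARMA-style update: same combination, no activation.
    return w * h + u * x

h_rnn, h_lin = 0.0, 0.0
for x in [5.0] * 20:          # feed a large constant input
    h_rnn = rnn_step(h_rnn, x)
    h_lin = arma_step(h_lin, x)

print(abs(h_rnn) < 1.0)   # the tanh keeps the RNN state bounded
print(h_lin > 10.0)       # the linear recurrence grows with the input scale
```

This boundedness is one way to see why an RNN's predicted value has an intrinsic limit that the corresponding linear (ARMA-like) predictor lacks.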

Keywords