Mathematics (Nov 2022)

Recurrent Neural Network Models Based on Optimization Methods

  • Predrag S. Stanimirović,
  • Spyridon D. Mourtas,
  • Vasilios N. Katsikis,
  • Lev A. Kazakovtsev,
  • Vladimir N. Krutikov

DOI
https://doi.org/10.3390/math10224292
Journal volume & issue
Vol. 10, no. 22
p. 4292

Abstract


Many researchers have addressed problems involving time-varying (TV) general linear matrix equations (GLMEs) because of their importance in science and engineering. This research addresses the solution of TV GLMEs using the zeroing neural network (ZNN) design. Five new ZNN models based on novel error functions arising from gradient-descent and Newton optimization methods are presented and compared to each other and to the standard ZNN design. Pseudoinversion is involved in four of the proposed ZNN models, while three of them are related to Newton’s optimization method. Heterogeneous numerical examples show that all models successfully solve TV GLMEs, although their effectiveness varies and depends on the input matrix.
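As a point of reference for the comparisons mentioned in the abstract, the standard ZNN design for a TV GLME of the form A(t)X(t)B(t) = D(t) defines the error function E(t) = A(t)X(t)B(t) − D(t) and drives it to zero through the evolution law Ė(t) = −λE(t). The Python sketch below illustrates this baseline on a hypothetical 2×2 example; the coefficient matrices, the particular GLME form, and the finite-difference derivatives are assumptions made for illustration and are not the models proposed in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

n = 2  # problem size (illustrative)

# Hypothetical time-varying coefficients of the GLME A(t) X(t) B(t) = D(t)
def A(t):
    return np.array([[np.sin(t) + 3.0, np.cos(t)],
                     [-np.cos(t), np.sin(t) + 3.0]])

def B(t):
    return np.array([[np.cos(t) + 3.0, np.sin(t)],
                     [-np.sin(t), np.cos(t) + 3.0]])

def D(t):
    return np.array([[np.sin(2 * t), np.cos(2 * t)],
                     [-np.cos(2 * t), np.sin(2 * t)]])

lam = 10.0  # ZNN design (convergence) parameter

def znn_rhs(t, x_flat):
    """Standard ZNN dynamics solved explicitly for dX/dt.

    With the error function E(t) = A X B - D, the design formula
    E'(t) = -lam * E(t) gives
        A' X B + A X' B + A X B' - D' = -lam * (A X B - D),
    so  X' = A^{-1} (-lam*(A X B - D) - A' X B - A X B' + D') B^{-1}.
    Time derivatives of A, B, D are approximated by central differences here.
    """
    X = x_flat.reshape(n, n)
    h = 1e-6
    dA = (A(t + h) - A(t - h)) / (2 * h)
    dB = (B(t + h) - B(t - h)) / (2 * h)
    dD = (D(t + h) - D(t - h)) / (2 * h)
    E = A(t) @ X @ B(t) - D(t)
    rhs = -lam * E - dA @ X @ B(t) - A(t) @ X @ dB + dD
    dX = np.linalg.solve(A(t), rhs) @ np.linalg.inv(B(t))
    return dX.ravel()

X0 = np.zeros((n, n))  # arbitrary initial state
sol = solve_ivp(znn_rhs, (0.0, 10.0), X0.ravel(), rtol=1e-8, atol=1e-10)

Xf = sol.y[:, -1].reshape(n, n)
tf = sol.t[-1]
residual = np.linalg.norm(A(tf) @ Xf @ B(tf) - D(tf))
print(f"residual ||A X B - D|| at t = {tf:.2f}: {residual:.2e}")
```

In this baseline, the residual norm decays exponentially at a rate governed by λ; the models proposed in the paper replace the error function E(t) with alternatives derived from gradient-descent and Newton optimization, some involving pseudoinversion.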

Keywords