Axioms (Jan 2024)

Application of Gradient Optimization Methods in Defining Neural Dynamics

  • Predrag S. Stanimirović,
  • Nataša Tešić,
  • Dimitrios Gerontitis,
  • Gradimir V. Milovanović,
  • Milena J. Petrović,
  • Vladimir L. Kazakovtsev,
  • Vladislav Stasiuk

DOI
https://doi.org/10.3390/axioms13010049
Journal volume & issue
Vol. 13, no. 1
p. 49

Abstract


Applications of the gradient method for nonlinear optimization to the development of the Gradient Neural Network (GNN) and the Zhang Neural Network (ZNN) are investigated. In particular, the solution of the time-varying matrix equation AXB=D is studied using a novel GNN model, termed GGNN(A,B,D). The GGNN model is developed by applying GNN dynamics to the gradient of the error matrix used in the development of the GNN model. The convergence analysis shows that the neural state matrix of the GGNN(A,B,D) design converges asymptotically to a solution of the matrix equation AXB=D for any initial state matrix. It is also shown that the limit of the convergence is the least-squares solution determined by the selected initial matrix. A hybridization of GGNN with GZNN, an analogous modification of the ZNN dynamics, is considered. A Simulink implementation of the presented GGNN models is carried out on a set of real matrices.
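To make the GNN idea concrete: for a constant-coefficient instance of AXB=D, the classical GNN flow descends the energy ε(X) = ‖AXB−D‖²_F/2 along its negative gradient, dX/dt = −γ Aᵀ(AXB−D)Bᵀ. The sketch below is a minimal Euler-integrated illustration of that flow; it is not the paper's GGNN model or its Simulink implementation, and the step size, gain γ, and test matrices are assumptions chosen for illustration.

```python
import numpy as np

def gnn_solve(A, B, D, gamma=1.0, dt=1e-3, steps=20000):
    """Euler integration of the GNN flow dX/dt = -gamma * A^T (A X B - D) B^T,
    which performs gradient descent on the energy ||A X B - D||_F^2 / 2.
    Illustrative sketch only; step size dt must satisfy the usual
    stability bound for the chosen matrices."""
    X = np.zeros((A.shape[1], B.shape[0]))  # initial state matrix X(0) = 0
    for _ in range(steps):
        E = A @ X @ B - D                   # error matrix E(t)
        X -= dt * gamma * A.T @ E @ B.T     # negative-gradient step
    return X

# Small example with a known exact solution
A = np.array([[2.0, 0.0], [1.0, 3.0]])
B = np.array([[1.0, 1.0], [0.0, 2.0]])
X_true = np.array([[1.0, -1.0], [0.5, 2.0]])
D = A @ X_true @ B

X = gnn_solve(A, B, D)
print(np.linalg.norm(A @ X @ B - D))  # residual decays toward zero
```

When A and B have full rank, the flow's equilibrium is the unique solution; for rank-deficient data, the state converges to a least-squares solution that depends on the initial matrix, which mirrors the convergence result stated in the abstract.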

Keywords