Axioms (Sep 2024)

Cohen–Grossberg Neural Network Delay Models with Fractional Derivatives with Respect to Another Function—Theoretical Bounds of the Solutions

  • Ravi Agarwal,
  • Snezhana Hristova,
  • Donal O’Regan

DOI: https://doi.org/10.3390/axioms13090605
Journal volume & issue: Vol. 13, no. 9, p. 605

Abstract

The Cohen–Grossberg neural network is studied in the case when the dynamics of the neurons are modeled by a Riemann–Liouville fractional derivative with respect to another function, and an appropriate initial condition is set up. Inequalities for the quadratic function and the absolute value function, together with their fractional derivatives with respect to another function, are proved; they are based on an appropriate modification of the Razumikhin method. These inequalities are applied to obtain bounds on the norms of any solution of the model; in particular, we apply the squared norm and the absolute value norm. These bounds depend significantly on the function used in the fractional derivative. We study the asymptotic behavior of the solutions of the model. When the function used in the fractional derivative increases without bound, the norms of the solutions approach zero. When this function is equal to the current time, the studied problem reduces to the model with the classical Riemann–Liouville fractional derivative, and the obtained results give sufficient conditions for the asymptotic behavior of the solutions of the corresponding model. When the function used in the fractional derivative is bounded, we obtain a finite bound for the solutions; this bound depends on the initial function, and the solutions do not approach zero. An example illustrating the theoretical results is given.
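
For context, a minimal sketch of the setting in LaTeX notation (the symbols below are illustrative assumptions, not taken from the paper itself). For an order $q \in (0,1)$ and a strictly increasing function $\psi$ with $\psi'(t) > 0$, the Riemann–Liouville fractional derivative of $x$ with respect to $\psi$ is commonly defined as

\[
\left({}^{RL}D^{q,\psi}_{t_0+} x\right)(t)
= \frac{1}{\Gamma(1-q)}\,\frac{1}{\psi'(t)}\,\frac{d}{dt}
\int_{t_0}^{t} \psi'(s)\,\bigl(\psi(t)-\psi(s)\bigr)^{-q}\, x(s)\, ds ,
\]

and a delayed Cohen–Grossberg neural network of the type considered here typically takes a form such as

\[
{}^{RL}D^{q,\psi}_{t_0+} x_i(t)
= -\,a_i\bigl(x_i(t)\bigr)\Bigl[\, b_i\bigl(x_i(t)\bigr)
- \sum_{j=1}^{n} c_{ij}\, f_j\bigl(x_j(t)\bigr)
- \sum_{j=1}^{n} d_{ij}\, g_j\bigl(x_j(t-\tau_j)\bigr)
- I_i \Bigr], \qquad i = 1, \dots, n,
\]

where $a_i$ are amplification functions, $b_i$ are behavior functions, $f_j$ and $g_j$ are activation functions, $\tau_j$ are delays, and $I_i$ are external inputs. The choice $\psi(t) = t$ recovers the classical Riemann–Liouville derivative, which is the reduction discussed in the abstract.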

Keywords