Discrete Dynamics in Nature and Society (Jan 2019)

Discrete-Time Zhang Neural Networks for Time-Varying Nonlinear Optimization

  • Min Sun,
  • Maoying Tian,
  • Yiju Wang

DOI: https://doi.org/10.1155/2019/4745759
Journal volume & issue: Vol. 2019

Abstract


As a special kind of recurrent neural network, the Zhang neural network (ZNN) has been successfully applied to the solution of various time-varying problems. In this paper, we present three Zhang et al. discretization (ZeaD) formulas, including a special two-step ZeaD formula, a general two-step ZeaD formula, and a general five-step ZeaD formula, and prove that the special and general two-step ZeaD formulas are convergent, while the general five-step ZeaD formula is not zero-stable and is thus divergent. Then, to solve time-varying nonlinear optimization (TVNO) problems in real time, based on the Taylor series expansion and the two convergent two-step ZeaD formulas above, we discretize the continuous-time ZNN (CTZNN) model of TVNO and thus obtain a special two-step discrete-time ZNN (DTZNN) model and a general two-step DTZNN model. Theoretical analyses indicate that the sequence generated by the first DTZNN model is divergent, while the sequence generated by the second DTZNN model is convergent. Furthermore, for the step-size of the second DTZNN model, a tight upper bound and the optimal step-size are discussed. Finally, some numerical results and comparisons are provided and analyzed to substantiate the efficacy of the proposed DTZNN models.
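
To make the general recipe behind the abstract concrete, the sketch below shows a minimal DTZNN-style iteration for a time-varying minimization min_x f(x, t): define the error e(t) = ∇_x f(x(t), t), impose the ZNN design ė(t) = -γ e(t), and discretize the resulting continuous-time model with a sampling gap τ. The time-varying quadratic objective, the one-step (Euler-type) update, and the step-size h = γτ used here are illustrative assumptions only; the paper's special and general two-step ZeaD-based DTZNN models use multi-step update rules whose coefficients are not given in this abstract.

```python
# Minimal sketch (assumed problem and Euler-type update, not the paper's
# exact two-step ZeaD-based models) of a discrete-time ZNN iteration for
# time-varying nonlinear optimization  min_x f(x, t).
import numpy as np

def grad(x, t):
    # Gradient of an assumed time-varying quadratic
    #   f(x, t) = 0.5 * ||x - c(t)||^2   with moving target c(t).
    c = np.array([np.sin(t), np.cos(t)])
    return x - c

def grad_t(x, t):
    # Partial derivative of the gradient with respect to time: -c'(t).
    return -np.array([np.cos(t), -np.sin(t)])

def hessian(x, t):
    # Hessian of the assumed objective (identity for this quadratic).
    return np.eye(2)

tau, h = 0.01, 0.1       # sampling gap tau and step-size h = gamma * tau (assumed values)
x = np.zeros(2)          # initial iterate x_0
for k in range(1000):
    t = k * tau
    H = hessian(x, t)
    # ZNN design: drive e(t) = grad f(x(t), t) to zero, then discretize the
    # resulting CTZNN ODE with a one-step formula:
    #   x_{k+1} = x_k - H_k^{-1} (h * grad_k + tau * d/dt grad_k)
    x = x - np.linalg.solve(H, h * grad(x, t) + tau * grad_t(x, t))

print("final gradient norm:", np.linalg.norm(grad(x, 1000 * tau)))
```

In this sketch the printed gradient norm measures how well the iterate tracks the moving minimizer; a two-step ZeaD-based update of the kind analyzed in the paper would replace the single-step correction with a combination of x_k and x_{k-1}, which is what determines the convergence or divergence results stated above.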