Advances in Difference Equations (Jan 2021)

Convergence analysis of gradient-based iterative algorithms for a class of rectangular Sylvester matrix equations based on Banach contraction principle

  • Adisorn Kittisopaporn,
  • Pattrawut Chansangiam,
  • Wicharn Lewkeeratiyutkul

DOI
https://doi.org/10.1186/s13662-020-03185-9
Journal volume & issue
Vol. 2021, no. 1
pp. 1 – 17

Abstract


We derive an iterative procedure for solving the generalized Sylvester matrix equation $AXB + CXD = E$, where $A, B, C, D, E$ are conforming rectangular matrices. Our algorithm is based on the gradient and the hierarchical identification principle. We convert the matrix iteration process into a first-order linear difference vector equation with a matrix coefficient. The Banach contraction principle shows that the sequence of approximate solutions converges to the exact solution for any initial matrix if and only if the convergence factor lies in a certain open interval. The contraction principle also yields the convergence rate and the error analysis, both governed by the spectral radius of the associated iteration matrix. We obtain the fastest convergence factor, namely the one that minimizes this spectral radius. In particular, we obtain iterative algorithms for the matrix equation $AXB = C$, the Sylvester equation, and the Kalman–Yakubovich equation. Numerical experiments illustrate the applicability, effectiveness, and efficiency of the proposed algorithm.
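As a rough illustration of how such a gradient-based scheme can be realized, the sketch below implements the update commonly used in the hierarchical-identification literature for $AXB + CXD = E$: one gradient step for each term, averaged at every iteration. The function name, the stopping rule, and the conservative choice of the convergence factor are illustrative assumptions; the admissible interval and the optimal (fastest) factor are those derived in the paper from the spectral radius of the iteration matrix.

```python
import numpy as np

def gradient_iterative_solve(A, B, C, D, E, mu=None, X0=None, tol=1e-10, max_iter=20000):
    """Sketch of a gradient-based iteration for AXB + CXD = E.

    `mu` plays the role of the convergence factor; here a conservative value
    is used as an assumption, whereas the paper characterizes the admissible
    open interval and the optimal factor via the iteration matrix.
    """
    if mu is None:
        # conservative factor based on spectral norms (assumption, not the paper's optimum)
        mu = 1.0 / (np.linalg.norm(A, 2) ** 2 * np.linalg.norm(B, 2) ** 2
                    + np.linalg.norm(C, 2) ** 2 * np.linalg.norm(D, 2) ** 2)
    X = np.zeros((A.shape[1], B.shape[0])) if X0 is None else X0.astype(float).copy()
    for _ in range(max_iter):
        R = E - A @ X @ B - C @ X @ D      # residual of the current approximation
        if np.linalg.norm(R, 'fro') < tol:
            break
        X1 = X + mu * A.T @ R @ B.T        # gradient step associated with the AXB term
        X2 = X + mu * C.T @ R @ D.T        # gradient step associated with the CXD term
        X = 0.5 * (X1 + X2)                # hierarchical identification: average the two updates
    return X

# Small consistency check with rectangular coefficients.
rng = np.random.default_rng(0)
m, n, p, q = 4, 3, 5, 4
A, C = rng.standard_normal((m, n)), rng.standard_normal((m, n))
B, D = rng.standard_normal((p, q)), rng.standard_normal((p, q))
X_true = rng.standard_normal((n, p))
E = A @ X_true @ B + C @ X_true @ D
X_hat = gradient_iterative_solve(A, B, C, D, E)
print("relative error:", np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true))
```

Vectorizing this matrix recursion is what turns it into the first-order linear difference vector equation mentioned in the abstract; the spectral radius of that equation's coefficient matrix governs whether, and how fast, the iterates converge.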

Keywords