Electronics Letters (Jun 2021)

Normalized stochastic gradient descent learning of general complex-valued models

  • T. Paireder
  • C. Motz
  • M. Huemer

DOI
https://doi.org/10.1049/ell2.12170
Journal volume & issue
Vol. 57, no. 12
pp. 493–495

Abstract

The stochastic gradient descent (SGD) method is one of the most prominent first-order iterative optimisation algorithms, enabling linear adaptive filters as well as general nonlinear learning schemes. It is applicable to a wide range of objective functions while featuring the low computational cost required for online operation. However, without a suitable step-size normalisation, the convergence and tracking behaviour of SGD can degrade in practical applications. In this letter, a novel, general normalisation approach is provided for the learning of (non-)holomorphic models with multiple independent parameter sets. The advantages of the proposed method are demonstrated by means of a specific widely-linear estimation example.
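
To make the setting concrete, the following is a minimal sketch of normalized SGD for a widely-linear complex-valued filter, using the classical augmented-NLMS recursion as the normalisation. It illustrates the problem class the letter addresses, not the letter's proposed general normalisation; the NumPy setup and all names and parameter values (N_TAPS, MU, EPS, the noise level) are illustrative assumptions.

    # Normalized SGD for a widely-linear model d[n] = h^H x[n] + g^H conj(x[n]),
    # using the augmented regressor [x; conj(x)] and step-size normalisation by
    # its instantaneous energy (classical augmented NLMS). Illustrative sketch.
    import numpy as np

    rng = np.random.default_rng(0)

    N_TAPS = 4        # filter length per branch (assumed)
    MU = 0.5          # normalized step size, 0 < MU < 2 for stability
    EPS = 1e-8        # regularizer avoiding division by zero
    N_SAMPLES = 5000

    # Unknown widely-linear system to be identified
    h_true = rng.standard_normal(N_TAPS) + 1j * rng.standard_normal(N_TAPS)
    g_true = rng.standard_normal(N_TAPS) + 1j * rng.standard_normal(N_TAPS)

    w = np.zeros(2 * N_TAPS, dtype=complex)   # augmented weights [h; g]
    x_buf = np.zeros(N_TAPS, dtype=complex)   # tapped-delay-line buffer

    mse = []
    for n in range(N_SAMPLES):
        # New circular complex Gaussian input sample
        x_new = (rng.standard_normal() + 1j * rng.standard_normal()) / np.sqrt(2)
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = x_new

        # Desired signal: widely-linear output plus small observation noise
        d = (h_true.conj() @ x_buf + g_true.conj() @ x_buf.conj()
             + 1e-2 * (rng.standard_normal() + 1j * rng.standard_normal()))

        x_aug = np.concatenate([x_buf, x_buf.conj()])  # augmented regressor
        e = d - w.conj() @ x_aug                       # a-priori error

        # Normalized SGD step: gradient of |e|^2 w.r.t. conj(w) is -conj(e)*x_aug,
        # scaled by the instantaneous regressor energy
        w += MU * np.conj(e) * x_aug / (EPS + np.vdot(x_aug, x_aug).real)
        mse.append(abs(e) ** 2)

    print("steady-state MSE ~", np.mean(mse[-500:]))

Without the normalising denominator, a stable fixed step size would have to be chosen against the worst-case input power; dividing by the instantaneous regressor energy makes convergence largely insensitive to the input scaling, which is the behaviour the letter generalises to arbitrary (non-)holomorphic models with multiple parameter sets.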
