IEEE Access (Jan 2024)

Generalized Exponentiated Gradient Algorithms and Their Application to On-Line Portfolio Selection

  • Andrzej Cichocki
  • Sergio Cruces
  • Auxiliadora Sarmiento
  • Toshihisa Tanaka

DOI
https://doi.org/10.1109/ACCESS.2024.3520389
Journal volume & issue
Vol. 12
pp. 197000–197020

Abstract


Stochastic gradient descent (SGD) and exponentiated gradient (EG) update methods are widely used in signal processing and machine learning. This study introduces a novel family of generalized exponentiated gradient updates (EGAB) derived from alpha-beta (AB) divergence regularization. The EGAB framework provides enhanced flexibility for processing data with varying distributions, thanks to the tunable hyperparameters of the AB divergence. We explore the applicability of these updates to online portfolio selection (OLPS) in financial markets, with the goal of developing algorithms that achieve high risk-adjusted returns even under relatively high transaction costs. The proposed EGAB algorithms are derived through constrained gradient optimization with regularization terms, demonstrating their versatility in OLPS by unifying the directional search of various existing algorithms and enabling interpolation between them. Our analysis and extensive computer simulations reveal that EGAB updates outperform existing OLPS algorithms, delivering strong results on several performance metrics, such as cumulative return, average excess return, Sharpe ratio, and Calmar ratio, especially when transaction costs are significant. In conclusion, this study introduces a new family of exponentiated gradient updates and demonstrates their flexibility and effectiveness through extensive simulations across a wide range of real-world financial datasets.
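Two ingredients of the abstract can be stated concretely. First, the alpha-beta (AB) divergence that supplies the regularization term. The definition below follows the authors' earlier work on AB divergences (Cichocki, Cruces, and Amari, Entropy 2011); we assume the paper uses this same two-parameter family:

\[
D_{AB}^{(\alpha,\beta)}(\mathbf{P}\,\|\,\mathbf{Q})
= -\frac{1}{\alpha\beta}\sum_{i}\left(
p_i^{\alpha} q_i^{\beta}
- \frac{\alpha}{\alpha+\beta}\, p_i^{\alpha+\beta}
- \frac{\beta}{\alpha+\beta}\, q_i^{\alpha+\beta}
\right),
\qquad \alpha,\ \beta,\ \alpha+\beta \neq 0,
\]

with the excluded parameter values recovered as limits; for instance, the limit (α, β) → (1, 0) yields the generalized Kullback–Leibler divergence.

Second, the classical exponentiated gradient portfolio rule that the EGAB family generalizes. The Python sketch below implements only that baseline EG update (Helmbold et al., 1998), not the EGAB algorithm itself: the α, β tuning, constraint handling, and transaction-cost terms are not specified in the abstract, and the learning rate and function names here are illustrative assumptions.

```python
import numpy as np

def eg_update(weights, price_relatives, eta=0.05):
    """One step of the classical exponentiated gradient (EG) portfolio
    update; the EGAB family described in the paper generalizes this rule
    via AB divergence regularization.

    weights         : current portfolio vector b_t (non-negative, sums to 1)
    price_relatives : vector x_t of close-to-close price ratios for period t
    eta             : learning rate (hypothetical default, not from the paper)
    """
    portfolio_return = weights @ price_relatives       # b_t . x_t
    grad = price_relatives / portfolio_return          # gradient of log(b . x)
    new_weights = weights * np.exp(eta * grad)         # multiplicative update
    return new_weights / new_weights.sum()             # project back onto the simplex

# Toy usage: three assets, one trading period with hypothetical price relatives.
b = np.array([1 / 3, 1 / 3, 1 / 3])
x = np.array([1.02, 0.99, 1.05])
b = eg_update(b, x)
print(b)  # weights tilt toward the better-performing assets
```

The multiplicative form keeps the weights non-negative by construction, so only a renormalization is needed to stay on the simplex; this is the property the abstract's regularized, AB-divergence-based updates inherit and extend.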

Keywords