Open Journal of Mathematical Optimization (Jan 2023)

Convergence Rates of Gradient Methods for Convex Optimization in the Space of Measures

  • Chizat, Lénaïc

DOI
https://doi.org/10.5802/ojmo.20
Journal volume & issue
Vol. 3
pp. 1–19

Abstract

We study the convergence rate of Bregman gradient methods for convex optimization in the space of measures on a $d$-dimensional manifold. Under basic regularity assumptions, we show that the suboptimality gap at iteration $k$ is in $O(\log(k)k^{-1})$ for multiplicative updates, while it is in $O(k^{-q/(d+q)})$ for additive updates, for some $q\in \{1,2,4\}$ determined by the structure of the objective function. Our flexible proof strategy, based on approximation arguments, allows us to painlessly cover all Bregman Proximal Gradient Methods (PGM) and their acceleration (APGM) under various geometries, such as the hyperbolic entropy and $L^p$ divergences. We also prove the tightness of our analysis with matching lower bounds and confirm the theoretical results with numerical experiments on low-dimensional problems. Note that all these optimization methods must additionally pay the computational cost of discretization, which can be exponential in $d$.
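To make the two update families in the abstract concrete, here is a minimal numerical sketch (not the paper's code) in which a measure is discretized as a probability vector over a fixed grid of n points. The quadratic objective F, the step size, and all names below are illustrative assumptions: the multiplicative step is an entropic mirror-descent update, and the additive step is a Euclidean projected-gradient update.

import numpy as np

rng = np.random.default_rng(0)
n = 200                           # grid size: the measure is a weight vector of length n
A = rng.standard_normal((n, n))
Q = A @ A.T / n                   # hypothetical positive semidefinite matrix
b = rng.standard_normal(n)

def grad_F(mu):
    # Gradient of the illustrative convex objective F(mu) = 0.5 * mu @ Q @ mu + b @ mu.
    return Q @ mu + b

def multiplicative_step(mu, eta):
    # Multiplicative (entropic mirror-descent) update:
    # mu <- mu * exp(-eta * grad F(mu)), renormalized to remain a probability vector.
    mu_new = mu * np.exp(-eta * grad_F(mu))
    return mu_new / mu_new.sum()

def additive_step(mu, eta):
    # Additive (Euclidean) gradient step followed by projection onto the simplex.
    v = mu - eta * grad_F(mu)
    u = np.sort(v)[::-1]          # sort-based simplex projection
    css = np.cumsum(u)
    rho = np.nonzero(u + (1.0 - css) / np.arange(1, n + 1) > 0)[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)
    return np.maximum(v - theta, 0.0)

mu_mult = np.full(n, 1.0 / n)     # start both methods from the uniform measure
mu_add = np.full(n, 1.0 / n)
for k in range(500):
    mu_mult = multiplicative_step(mu_mult, eta=0.1)
    mu_add = additive_step(mu_add, eta=0.1)

In the paper's setting the grid points live on a $d$-dimensional manifold and the discretization cost can grow exponentially in $d$; this sketch only illustrates the weight updates, not that discretization cost.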

Keywords