Axioms (Jun 2024)

Optimal and Efficient Approximations of Gradients of Functions with Nonindependent Variables

  • Matieyendou Lamboni

DOI
https://doi.org/10.3390/axioms13070426
Journal volume & issue
Vol. 13, no. 7
p. 426

Abstract


Gradients of smooth functions with nonindependent variables are relevant for exploring complex models and for the optimization of functions subject to constraints. In this paper, we investigate new and simple approximations and computations of such gradients by making use of independent, central, and symmetric variables. These approximations are well suited for applications in which computing the gradients is too expensive or impossible. The derived upper bounds on the biases of our approximations do not suffer from the curse of dimensionality for any 2-smooth function, and they theoretically improve on known results. Our estimators of such gradients also reach the optimal (mean squared error) rate of convergence (i.e., O(N⁻¹)) for the same class of functions. Numerical comparisons based on a test case and a high-dimensional PDE model show the efficiency of our approach.
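The abstract describes approximating gradients via evaluations driven by independent, centered, and symmetric variables. As a rough illustration of that general idea only (not the paper's specific estimator or its bias bounds), the sketch below uses i.i.d. Gaussian perturbations with antithetic (central) function evaluations; the function names, step size h, and sample size are illustrative assumptions.

```python
import numpy as np

def smoothed_gradient(f, x, n_samples=1000, h=1e-3, rng=None):
    """Monte Carlo gradient estimate from independent, centered, symmetric
    Gaussian perturbations with antithetic (central) evaluations.

    Generic randomized central-difference scheme for illustration only,
    not the estimator proposed in the paper; f is called 2*n_samples times.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    grad = np.zeros(x.size)
    for _ in range(n_samples):
        xi = rng.standard_normal(x.size)   # independent, centered, symmetric
        grad += (f(x + h * xi) - f(x - h * xi)) * xi
    return grad / (2.0 * h * n_samples)

# Usage on a quadratic test function: the exact gradient is A @ x0,
# so the Monte Carlo estimate can be checked directly.
if __name__ == "__main__":
    A = np.array([[2.0, 0.5], [0.5, 1.0]])
    f = lambda x: 0.5 * x @ A @ x
    x0 = np.array([1.0, -1.0])
    print(smoothed_gradient(f, x0), "vs exact", A @ x0)
```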

Keywords