Entropy (Oct 2021)

PAC-Bayes Unleashed: Generalisation Bounds with Unbounded Losses

  • Maxime Haddouche,
  • Benjamin Guedj,
  • Omar Rivasplata,
  • John Shawe-Taylor

DOI: https://doi.org/10.3390/e23101330
Journal volume & issue: Vol. 23, no. 10, p. 1330

Abstract

We present new PAC-Bayesian generalisation bounds for learning problems with unbounded loss functions. This extends the relevance and applicability of the PAC-Bayes learning framework, where most of the existing literature focuses on supervised learning problems with a bounded loss function (typically assumed to take values in the interval [0,1]). To relax this classical assumption, we allow the range of the loss to depend on each predictor. This relaxation is captured by our new notion of HYPothesis-dependent rangE (HYPE). Based on this, we derive a novel PAC-Bayesian generalisation bound for unbounded loss functions, and we instantiate it on a linear regression problem. To make our theory usable by the largest audience possible, we include discussions on actual computation, practicality and limitations of our assumptions.
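To make the HYPE idea concrete, the following is a minimal sketch (not the paper's exact construction) of a hypothesis-dependent range for the squared loss of a linear predictor: assuming inputs lie in a ball of radius `x_radius` and labels in `[-y_radius, y_radius]`, Cauchy-Schwarz gives a per-predictor envelope K(w) that bounds the loss, even though no single constant bounds the loss over all predictors. All names and constants here are illustrative assumptions.

```python
import numpy as np

def hype_envelope(w, x_radius, y_radius):
    """Illustrative hypothesis-dependent range K(w) for the squared loss
    of the linear predictor h_w(x) = <w, x>, assuming ||x|| <= x_radius
    and |y| <= y_radius. (Sketch only; the paper's constants may differ.)"""
    # |<w, x> - y| <= ||w|| * x_radius + y_radius by Cauchy-Schwarz,
    # so the squared loss is at most the square of that bound.
    return (np.linalg.norm(w) * x_radius + y_radius) ** 2

# Empirical sanity check: sampled losses never exceed the envelope.
rng = np.random.default_rng(0)
w = rng.normal(size=5)
X = rng.uniform(-1, 1, size=(1000, 5))
X /= np.maximum(1.0, np.linalg.norm(X, axis=1, keepdims=True))  # force ||x|| <= 1
y = rng.uniform(-1, 1, size=1000)
losses = (X @ w - y) ** 2
print(losses.max() <= hype_envelope(w, x_radius=1.0, y_radius=1.0))
```

Note that K(w) grows with the norm of the predictor, which is the sense in which the loss is "unbounded" over the hypothesis class while remaining bounded for each fixed hypothesis.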

Keywords