Machine Learning: Science and Technology (Jan 2023)

Error scaling laws for kernel classification under source and capacity conditions

  • Hugo Cui
  • Bruno Loureiro
  • Florent Krzakala
  • Lenka Zdeborová

DOI: https://doi.org/10.1088/2632-2153/acf041
Journal volume & issue: Vol. 4, no. 3, p. 035033

Abstract

In this manuscript, we consider the problem of kernel classification. While worst-case bounds on the decay rate of the prediction error with the number of samples are known for some classifiers, they often fail to accurately describe the learning curves of real data sets. In this work, we focus on the important class of data sets satisfying the standard source and capacity conditions, which, as we show numerically, comprises a number of real data sets. Under the Gaussian design, we derive the decay rates of the misclassification (prediction) error as a function of the source and capacity coefficients. We do so for two standard kernel classification settings, namely margin-maximizing support vector machines and ridge classification, and contrast the two methods. We find that our rates tightly describe the learning curves for this class of data sets and that they are also observed on real data. Our results can also be seen as an explicit prediction of the exponents of a scaling law for kernel classification that is accurate on some real data sets.
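For reference, the source and capacity conditions are usually parametrized as follows; this is the standard convention in the kernel literature and may differ in detail from the paper's exact notation. Writing \lambda_k for the eigenvalues of the kernel operator, sorted in decreasing order, and \theta^\star_k for the coefficients of the target function in the corresponding eigenbasis,

    \lambda_k \asymp k^{-\alpha}, \quad \alpha > 1 \qquad \text{(capacity condition)}
    \sum_k \lambda_k^{\,1-2r}\,(\theta^\star_k)^2 < \infty, \quad r > 0 \qquad \text{(source condition)}

so that a larger capacity coefficient \alpha corresponds to a faster-decaying spectrum, and a larger source coefficient r to a smoother target. In this language, the rates derived in the paper are power laws for the misclassification error, \epsilon \propto n^{-\beta}, with an exponent \beta determined by \alpha and r.

As an illustration of how the capacity coefficient of a data set can be checked numerically, the sketch below estimates \alpha from the decay of the empirical kernel spectrum. It is a minimal sketch only: the choice of an RBF kernel, the helper name estimate_capacity_exponent, and the fitting window [k_min, k_max) are illustrative assumptions, not the authors' procedure.

    import numpy as np
    from sklearn.metrics.pairwise import rbf_kernel

    def estimate_capacity_exponent(X, gamma=None, k_min=10, k_max=200):
        # Estimate alpha assuming the empirical kernel eigenvalues decay as lambda_k ~ k**(-alpha).
        K = rbf_kernel(X, X, gamma=gamma)               # n x n Gram matrix (RBF kernel as an example choice)
        lam = np.linalg.eigvalsh(K)[::-1] / X.shape[0]  # eigenvalues, descending, normalized by n
        lam = lam[lam > 1e-12]                          # discard numerically zero eigenvalues
        k = np.arange(1, len(lam) + 1)
        lo, hi = k_min, min(k_max, len(lam))            # fit away from the top modes and the noisy tail
        slope, _ = np.polyfit(np.log(k[lo:hi]), np.log(lam[lo:hi]), 1)
        return -slope                                   # alpha is minus the slope in log-log coordinates

Calling estimate_capacity_exponent on a (sub)sample of a real data set returns the fitted spectral decay exponent, which can then be compared with the power-law exponent observed in the corresponding learning curve.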

Keywords