PLoS ONE (Jan 2017)

Distributed optimization of multi-class SVMs.

  • Maximilian Alber,
  • Julian Zimmert,
  • Urun Dogan,
  • Marius Kloft

DOI
https://doi.org/10.1371/journal.pone.0178161
Journal volume & issue
Vol. 12, no. 6
p. e0178161

Abstract


Training of one-vs.-rest SVMs can be parallelized over the number of classes in a straightforward way. Given enough computational resources, one-vs.-rest SVMs can thus be trained on data involving a large number of classes. The same does not hold, however, for the so-called all-in-one SVMs, which require solving a quadratic program whose size grows quadratically with the number of classes. We develop distributed algorithms for two all-in-one SVM formulations (Lee et al. and Weston and Watkins) that parallelize the computation evenly over the number of classes. This allows us to compare these models to one-vs.-rest SVMs at an unprecedented scale. The results indicate superior accuracy on text classification data.
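A minimal sketch (not from the paper) of the "embarrassingly parallel" one-vs.-rest scheme the abstract refers to: each class defines an independent binary SVM, so training distributes trivially over classes. It assumes scikit-learn's LinearSVC and joblib for parallelism; the function names are illustrative, not the authors' implementation.

```python
import numpy as np
from joblib import Parallel, delayed
from sklearn.svm import LinearSVC


def train_one_vs_rest_parallel(X, y, n_jobs=-1):
    # Hypothetical helper, for illustration only.
    classes = np.unique(y)

    def fit_binary(c):
        # Relabel: +1 for the target class, -1 for every other class.
        clf = LinearSVC()
        clf.fit(X, np.where(y == c, 1, -1))
        return clf

    # One independent binary problem per class -> trivially parallel/distributable.
    models = Parallel(n_jobs=n_jobs)(delayed(fit_binary)(c) for c in classes)
    return classes, models


def predict_one_vs_rest(X, classes, models):
    # Pick the class whose binary SVM gives the highest decision value.
    scores = np.column_stack([m.decision_function(X) for m in models])
    return classes[scores.argmax(axis=1)]
```

By contrast, the all-in-one formulations studied in the paper couple all classes in a single quadratic program, which is why a dedicated distributed algorithm is needed rather than this simple per-class split.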