IEEE Access (Jan 2023)

Sequential Minimal Optimization Algorithm for One-Class Support Vector Machines With Privileged Information

  • Andrey Lange,
  • Dmitry Smolyakov,
  • Evgeny Burnaev

DOI
https://doi.org/10.1109/ACCESS.2023.3331685
Journal volume & issue
Vol. 11
pp. 128106 – 128124

Abstract

One powerful technique in data modeling is accounting for features that are available at the training stage but not when the trained model is used to classify or predict test data, a setting known as the Learning Using Privileged Information (LUPI) paradigm of Vapnik and Vashist. Sequential Minimal Optimization (SMO) methods have already been developed for supervised Support Vector Machines (SVM) by Platt and by Keerthi et al., for unsupervised (one-class) SVM by Schölkopf et al., and for SVM with privileged information (SVM+) by Pechyony and Vapnik. The missing piece in this line of research has long been a one-class SVM with privileged information (OC-SVM+). In this paper, we propose an SMO algorithm for OC-SVM+ that significantly outperforms non-sequential algorithms for training the OC-SVM+ model. Its finite-time convergence is established. The experiments show how privileged information affects the descriptive domain in the space of original features. Comparative benchmark tests demonstrate that our algorithm is superior to interior-point algorithms.
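For background on the kind of problem an SMO solver iterates over, the standard one-class SVM dual of Schölkopf et al. (without privileged information) is sketched below; the OC-SVM+ dual studied in the paper additionally involves correcting-function terms defined in the privileged-feature space, and that exact formulation is not reproduced here.

% Standard one-class SVM dual (Schölkopf et al.), shown only as context;
% k is the kernel on the original features, nu the usual one-class parameter.
\begin{aligned}
\min_{\alpha}\quad & \tfrac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n}\alpha_i \alpha_j\, k(x_i, x_j) \\
\text{s.t.}\quad & 0 \le \alpha_i \le \tfrac{1}{\nu n}, \qquad \sum_{i=1}^{n}\alpha_i = 1,
\end{aligned}
\qquad
f(x) = \operatorname{sgn}\Bigl(\sum_{i=1}^{n}\alpha_i k(x_i, x) - \rho\Bigr).

An SMO-style method repeatedly selects a pair of multipliers, solves the resulting two-variable subproblem in closed form while preserving the equality constraint, and stops once the KKT conditions are satisfied within a tolerance.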

Keywords