Frontiers in Applied Mathematics and Statistics (Jun 2019)

Stochastic AUC Optimization Algorithms With Linear Convergence

  • Michael Natole,
  • Yiming Ying,
  • Siwei Lyu

DOI
https://doi.org/10.3389/fams.2019.00030
Journal volume & issue
Vol. 5

Abstract


Area under the ROC curve (AUC) is a standard metric used to measure classification performance on imbalanced class data. Developing stochastic learning algorithms that directly maximize AUC rather than accuracy is therefore of practical interest. However, AUC maximization presents a challenge because the learning objective is defined over pairs of instances from opposite classes. Existing methods circumvent this issue, but at the cost of high space and time complexity. Building on our previous work, which reformulated AUC optimization as a convex-concave saddle point problem, we propose a new stochastic batch learning algorithm for AUC maximization. The key difference from our previous work is that we assume the underlying data distribution is uniform over the training set, and we develop a stochastic primal-dual algorithm (SPDAM) for batch learning that achieves a linear convergence rate. We establish the theoretical convergence of SPDAM with high probability and demonstrate its effectiveness on standard benchmark datasets.
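For context, the convex-concave reformulation referenced above (introduced in the authors' earlier work on stochastic online AUC maximization) replaces the pairwise least-squares AUC surrogate with a pointwise saddle-point objective in two auxiliary primal variables a, b and one dual variable alpha. The sketch below is an illustrative single stochastic primal-dual step on that objective; it is not the exact SPDAM update from the paper, and the step sizes eta and eta_dual, the regularizer lam, and the positive-class proportion p are assumed inputs for illustration only.

import numpy as np

def spd_step(w, a, b, alpha, x, y, p, eta, eta_dual, lam):
    # One gradient descent/ascent step on the pointwise saddle-point surrogate
    #   F(w, a, b, alpha; x, y) =
    #     (1-p) * (w.x - a)^2 * [y=+1] + p * (w.x - b)^2 * [y=-1]
    #     + 2 * (1 + alpha) * (w.x) * (p*[y=-1] - (1-p)*[y=+1])
    #     - p*(1-p) * alpha^2 + (lam/2) * ||w||^2
    # where p is the positive-class proportion (all hyperparameters assumed).
    score = w @ x
    pos = 1.0 if y == 1 else 0.0
    neg = 1.0 - pos
    grad_w = (2 * (1 - p) * (score - a) * pos * x
              + 2 * p * (score - b) * neg * x
              + 2 * (1 + alpha) * (p * neg - (1 - p) * pos) * x
              + lam * w)
    grad_a = -2 * (1 - p) * (score - a) * pos
    grad_b = -2 * p * (score - b) * neg
    grad_alpha = 2 * score * (p * neg - (1 - p) * pos) - 2 * p * (1 - p) * alpha
    # Descend in the primal variables (w, a, b), ascend in the dual variable alpha.
    return (w - eta * grad_w, a - eta * grad_a, b - eta * grad_b,
            alpha + eta_dual * grad_alpha)

Because the objective is evaluated at a single instance (x, y) rather than a pair of opposite-class instances, a stochastic update of this form avoids the pairwise storage and computation that the abstract identifies as the main bottleneck of existing AUC maximization methods.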

Keywords