IEEE Access (Jan 2018)

Analysis of the Variance Reduction in SVRG and a New Acceleration Method

  • Erxue Min,
  • Jun Long,
  • Jianjing Cui

DOI
https://doi.org/10.1109/ACCESS.2018.2814212
Journal volume & issue
Vol. 6
pp. 16165 – 16175

Abstract

Stochastic gradient descent is a popular method in large-scale optimization for machine learning but suffers from slow convergence. In recent years, the stochastic variance reduced gradient (SVRG) method was proposed to remedy this problem. Although many variants of SVRG have been studied, the analysis of the variance itself has not been thoroughly discussed. In this paper, we propose a general framework, denoted epoch-update-identification (EUI), which is an abstraction of the existing variants of SVRG. Under this framework, i.e., EUI, we then provide a general analysis of the variance reduction technique from a new perspective. Additionally, previous variants of SVRG must compute a snapshot of the full gradient for each epoch, which is computationally expensive. In this paper, we also propose a new variant of SVRG, named sampleVR, which estimates the snapshot of the full gradient using a sampling strategy, thereby significantly decreasing the gradient complexity. Both the theoretical analysis and extensive empirical studies show that sampleVR achieves a good tradeoff between convergence performance and gradient complexity, and thus makes the training loss converge faster than its counterparts.
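To make the idea concrete, the following is a minimal sketch of SVRG on a least-squares objective, with an optional sampled snapshot in the spirit of sampleVR: when `snapshot_batch` is set, the per-epoch anchor gradient is estimated from a random subsample rather than the full dataset. The function name, parameters, and the least-squares setting are illustrative assumptions, not the paper's exact algorithm or experimental setup.

```python
import numpy as np

def svrg(X, y, epochs=30, inner_steps=200, lr=0.02, snapshot_batch=None, rng=None):
    """SVRG for the least-squares objective 0.5/n * ||Xw - y||^2.

    snapshot_batch=None computes the exact full-gradient snapshot each epoch
    (classic SVRG); an integer m estimates it from m sampled points instead,
    a sketch of the sampleVR idea of cutting the snapshot's gradient cost.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    n, d = X.shape
    w = np.zeros(d)
    grad_i = lambda w, i: X[i] * (X[i] @ w - y[i])  # gradient of one sample
    for _ in range(epochs):
        w_snap = w.copy()
        if snapshot_batch is None:
            mu = X.T @ (X @ w_snap - y) / n  # exact full gradient at the snapshot
        else:
            # sampled estimate of the snapshot gradient (hypothetical sampleVR sketch)
            idx = rng.choice(n, size=snapshot_batch, replace=False)
            mu = X[idx].T @ (X[idx] @ w_snap - y[idx]) / snapshot_batch
        for _ in range(inner_steps):
            i = rng.integers(n)
            # variance-reduced stochastic gradient: the correction term
            # grad_i(w_snap, i) - mu vanishes in expectation and shrinks
            # the variance as w approaches w_snap
            g = grad_i(w, i) - grad_i(w_snap, i) + mu
            w -= lr * g
    return w
```

In this sketch the sampled snapshot trades a small amount of extra variance in the anchor gradient `mu` for an `m/n` reduction in per-epoch gradient evaluations, which is the tradeoff the abstract describes.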

Keywords