IEEE Access (Jan 2020)

HyperTube: A Framework for Population-Based Online Hyperparameter Optimization with Resource Constraints

  • Renlong Jie,
  • Junbin Gao,
  • Andrey Vasnev,
  • Minh-Ngoc Tran

DOI: https://doi.org/10.1109/ACCESS.2020.2986456
Journal volume & issue: Vol. 8, pp. 69038–69057

Abstract

Online data streaming has become one of the most common forms of data in the modern world, creating a strong demand for hyperparameter optimization techniques suited to online learning algorithms. Processing streaming data can itself be viewed as a constraint on the training process, yet existing studies do not provide a clear framework for modeling such constraints in hyperparameter optimization. In this paper, we propose a framework, called HyperTube, built on a set of assumptions that clearly define the constraints and objective function for online hyperparameter optimization under limited computing resources. We also introduce a “micro-mini-batch training mechanism” that reuses online data mini-batches efficiently. Numerical experiments compare the performance of different training settings under the constraints of HyperTube. On stationary data streams without concept drift, the results indicate that training on incrementally accumulated samples with model selection uses computing power efficiently and yields satisfactory validation performance relative to training without model selection. On data streams with significant concept drift, the results indicate that parallel updating leads to relatively good model performance. In both cases, with the best settings, HyperTube with the micro-mini-batch training mechanism significantly outperforms offline random search given the same amount of computational resources. These settings can be further improved by a modified genetic algorithm. Finally, we develop a systematic method for selecting suitable settings based on a set of criteria.
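To make the setting concrete, the sketch below illustrates one way such a loop could look: each arriving mini-batch is split into "micro-mini-batches" that are reused across a small population of candidate models, and a periodic selection step clones the best candidate's state into the worst. All names here (Candidate, micro_batches, the toy one-weight linear model) are hypothetical illustrations, not the paper's implementation, and the selection step is a generic population-based-training-style exploit/explore move rather than the authors' modified genetic algorithm.

```python
# Hypothetical sketch of population-based online hyperparameter optimization
# on a data stream; names and model are illustrative, not from the paper.
import random
from dataclasses import dataclass

@dataclass
class Candidate:
    lr: float                   # hyperparameter: learning rate
    w: float = 0.0              # single weight of a toy linear model y = w * x
    loss: float = float("inf")  # latest validation loss

def micro_batches(batch, k):
    """Split one online mini-batch into k micro-mini-batches so the same
    stream data can be reused for several gradient updates."""
    step = max(1, len(batch) // k)
    return [batch[i:i + step] for i in range(0, len(batch), step)]

def sgd_step(c, data):
    # gradient of the mean squared error for the model y_hat = w * x
    grad = sum(2 * (c.w * x - y) * x for x, y in data) / len(data)
    c.w -= c.lr * grad

def validate(c, data):
    c.loss = sum((c.w * x - y) ** 2 for x, y in data) / len(data)

# Simulated stream: 50 mini-batches of 32 samples from y = 3x + noise.
random.seed(0)
stream = [[(x, 3 * x + random.gauss(0, 0.1))
           for x in (random.random() for _ in range(32))]
          for _ in range(50)]

# Small population with randomly drawn learning rates.
population = [Candidate(lr=10 ** random.uniform(-3, -1)) for _ in range(4)]

for t, batch in enumerate(stream):
    val, train = batch[:8], batch[8:]          # hold out part of each batch
    for c in population:
        for mb in micro_batches(train, k=3):   # reuse the same batch k times
            sgd_step(c, mb)
        validate(c, val)
    if (t + 1) % 10 == 0:                      # periodic selection step
        population.sort(key=lambda c: c.loss)
        best, worst = population[0], population[-1]
        worst.w = best.w                       # exploit: clone best weights
        worst.lr = best.lr * random.choice([0.5, 2.0])  # explore: perturb lr

best = min(population, key=lambda c: c.loss)
print(f"best lr={best.lr:.4f}  w={best.w:.3f}  val loss={best.loss:.5f}")
```

Under these assumptions, the resource constraint is implicit in the fixed budget of gradient steps per arriving batch (population size times the number of micro-mini-batches), which is the kind of trade-off the experiments in the paper vary across settings.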

Keywords