PLoS ONE (Jan 2022)

Dynamic performance-energy tradeoff consolidation with contention-aware resource provisioning in containerized clouds.

  • Rewer M Canosa-Reyes,
  • Andrei Tchernykh,
  • Jorge M Cortés-Mendoza,
  • Bernardo Pulido-Gaytan,
  • Raúl Rivera-Rodriguez,
  • Jose E Lozano-Rizk,
  • Eduardo R Concepción-Morales,
  • Harold Enrique Castro Barrera,
  • Carlos J Barrios-Hernandez,
  • Favio Medrano-Jaimes,
  • Arutyun Avetisyan,
  • Mikhail Babenko,
  • Alexander Yu Drozdov

DOI: https://doi.org/10.1371/journal.pone.0261856
Journal volume & issue: Vol. 17, no. 1, p. e0261856

Abstract

Containers have emerged as a more portable and efficient alternative to virtual machines for cloud infrastructure, providing a flexible way to build and deploy applications. Quality of service, security, performance, and energy consumption, among other factors, are essential aspects of their deployment, management, and orchestration. Inappropriate resource allocation can lead to resource contention, resulting in reduced performance, poor energy efficiency, and other potentially damaging effects. In this paper, we present a set of online job allocation strategies to optimize quality of service, energy savings, and completion time, considering contention for shared on-chip resources. We model job allocation as a multilevel dynamic bin-packing problem and provide a lightweight runtime solution that minimizes contention and energy consumption while maximizing utilization. The proposed strategies are based on two- and three-level scheduling policies with container selection, capacity distribution, and contention-aware allocation. The energy model accounts for the joint execution of applications of different types on shared resources, generalized by the job concentration paradigm. We provide an experimental analysis of eighty-six scheduling heuristics with scientific workloads of memory- and CPU-intensive jobs. The proposed techniques outperform classical solutions in terms of quality of service, energy savings, and completion time by 21.73-43.44%, 44.06-92.11%, and 16.38-24.17%, respectively, leading to a cost-efficient resource allocation for cloud infrastructures.
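To make the abstract's idea of contention-aware, bin-packing-style placement concrete, the sketch below shows a minimal best-fit heuristic that packs jobs onto nodes while penalizing co-location of same-intensity jobs. It is an illustration only, not the authors' algorithm or energy model: the Job and Node classes, the intensity classification, and the contention penalty weight are all assumptions made for this example.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Job:
    name: str
    cpu: float          # fraction of one node's CPU capacity (assumed normalization)
    mem: float          # fraction of one node's memory capacity
    memory_bound: bool  # rough intensity class: memory- vs CPU-intensive

@dataclass
class Node:
    name: str
    cpu_free: float = 1.0
    mem_free: float = 1.0
    jobs: List[Job] = field(default_factory=list)

    def fits(self, job: Job) -> bool:
        return job.cpu <= self.cpu_free and job.mem <= self.mem_free

    def contention_penalty(self, job: Job) -> float:
        # Hypothetical penalty: co-locating jobs of the same intensity class
        # (e.g. two memory-bound jobs sharing cache/memory bandwidth) is
        # assumed to cost more than mixing classes; 0.2 is an arbitrary weight.
        same_class = sum(1 for j in self.jobs if j.memory_bound == job.memory_bound)
        return 0.2 * same_class

def allocate(job: Job, nodes: List[Node]) -> Optional[Node]:
    """Best-fit placement with a contention-aware cost term (illustrative only)."""
    candidates = [n for n in nodes if n.fits(job)]
    if not candidates:
        return None  # in a full system the job would wait or a new node would be powered on
    # Prefer tightly packed nodes (energy) but penalize contention (performance).
    best = min(
        candidates,
        key=lambda n: (n.cpu_free - job.cpu) + (n.mem_free - job.mem)
                      + n.contention_penalty(job),
    )
    best.cpu_free -= job.cpu
    best.mem_free -= job.mem
    best.jobs.append(job)
    return best

if __name__ == "__main__":
    nodes = [Node("n0"), Node("n1")]
    jobs = [Job("a", 0.4, 0.2, False), Job("b", 0.3, 0.5, True), Job("c", 0.3, 0.4, True)]
    for job in jobs:
        target = allocate(job, nodes)
        print(job.name, "->", target.name if target else "queued")
```

The paper's multilevel strategies additionally handle container selection and capacity distribution across scheduling levels, which this single-level sketch omits.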