IEEE Access (Jan 2022)

Autoscaling Pods on an On-Premise Kubernetes Infrastructure QoS-Aware

  • Lluís Mas Ruiz,
  • Pere Piñol Pueyo,
  • Jordi Mateo-Fornés,
  • Jordi Vilaplana Mayoral,
  • Francesc Solsona Tehàs

DOI: https://doi.org/10.1109/ACCESS.2022.3158743
Journal volume & issue: Vol. 10, pp. 33083–33094

Abstract

Cloud systems and microservices are becoming powerful tools for businesses. The evidence of the advantages of offering infrastructure, platform or software as a service (IaaS, PaaS, SaaS) is overwhelming. Microservices and decoupled applications are increasingly popular. These container-based architectures have facilitated the efficient development of complex SaaS applications. A major challenge is to design and manage microservices that offer a wide range of facilities, from processing and data storage to predictive and prescriptive analytics. Computing providers rely mainly on data centers composed of massive, heterogeneous virtualized systems that continuously grow and diversify over time. Moreover, these systems must be integrated into existing infrastructure while meeting Quality of Service (QoS) constraints. The primary purpose of this work is to present an on-premise architecture based on Kubernetes and Docker containers aimed at improving QoS with respect to resource usage and service level objectives (SLOs). The main contribution of this proposal is its dynamic autoscaling capability, which adjusts system resources to the current workload while improving QoS.
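As a rough illustration of the kind of dynamic, QoS-aware autoscaling the abstract describes, the sketch below shows a minimal control loop written against the official Kubernetes Python client. It is not the paper's controller: the namespace, Deployment name, latency metric, SLO threshold, scaling step and loop period are all hypothetical placeholders, and a real implementation would pull metrics from a monitoring system (e.g. Prometheus) rather than the stub used here.

```python
# Illustrative sketch only: a minimal SLO-driven autoscaling loop.
# All names, thresholds and the metric source are hypothetical and
# do not reproduce the paper's actual architecture or controller.
import time
from kubernetes import client, config

NAMESPACE = "default"            # hypothetical namespace
DEPLOYMENT = "web-service"       # hypothetical Deployment name
SLO_LATENCY_MS = 200.0           # hypothetical service level objective
MIN_REPLICAS, MAX_REPLICAS = 1, 10


def observed_latency_ms() -> float:
    """Stand-in for a real metric source (e.g. a Prometheus query)."""
    return 250.0  # hypothetical measurement


def main() -> None:
    config.load_kube_config()    # or load_incluster_config() when run inside a Pod
    apps = client.AppsV1Api()

    while True:
        dep = apps.read_namespaced_deployment(DEPLOYMENT, NAMESPACE)
        replicas = dep.spec.replicas or MIN_REPLICAS
        latency = observed_latency_ms()

        # Scale out when the SLO is violated, scale in when there is headroom.
        if latency > SLO_LATENCY_MS and replicas < MAX_REPLICAS:
            replicas += 1
        elif latency < 0.5 * SLO_LATENCY_MS and replicas > MIN_REPLICAS:
            replicas -= 1

        apps.patch_namespaced_deployment_scale(
            DEPLOYMENT, NAMESPACE,
            {"spec": {"replicas": replicas}},
        )
        time.sleep(30)           # control-loop period (hypothetical)


if __name__ == "__main__":
    main()
```

Kubernetes' built-in HorizontalPodAutoscaler already covers the common CPU- and memory-driven case; a custom loop along these lines is only warranted when scaling decisions must track an application-level SLO such as response time.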

Keywords