Journal of Cloud Computing: Advances, Systems and Applications (Jul 2023)

Latency and resource consumption analysis for serverless edge analytics

  • Rafael Moreno-Vozmediano,
  • Eduardo Huedo,
  • Rubén S. Montero,
  • Ignacio M. Llorente

DOI
https://doi.org/10.1186/s13677-023-00485-9
Journal volume & issue
Vol. 12, no. 1
pp. 1 – 22

Abstract

The serverless computing model, implemented by Function as a Service (FaaS) platforms, can offer several advantages for the deployment of data analytics solutions in IoT environments, such as agile and on-demand resource provisioning, automatic scaling, high elasticity, infrastructure management abstraction, and a fine-grained cost model. However, for applications with strict latency requirements, the cold start problem in FaaS platforms can represent an important drawback. The most common techniques to alleviate this problem, mainly based on instance pre-warming and instance reuse mechanisms, are usually not well adapted to different application profiles and, in general, can entail additional resource consumption. In this work, we analyze the effect of instance pre-warming and instance reuse on both application latency (response time) and resource consumption, for a typical data analytics use case (a machine learning application for image classification) with different input data patterns. Furthermore, we propose extending the classical centralized cloud-based serverless FaaS platform to a two-tier distributed edge-cloud platform to bring the platform closer to the data source and reduce network latencies.
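To illustrate the instance-reuse pattern the abstract refers to, the following is a minimal sketch of a FaaS-style image-classification handler in Python. It is not the authors' code: all names (load_model, classify stand-in, handler) and the simulated model-load delay are illustrative assumptions. The point is the standard pattern of caching the model in a module-level variable so that only the first (cold) invocation on an instance pays the initialization cost, while subsequent invocations on the warm instance skip it.

```python
import time

_MODEL = None  # survives across invocations while the instance stays warm


def load_model():
    """Placeholder for an expensive model load (e.g. reading weights from storage)."""
    time.sleep(2.0)  # stands in for real model-loading latency
    return lambda image_bytes: "cat"  # trivial stand-in classifier


def handler(event):
    """FaaS entry point: classify the image contained in the request."""
    global _MODEL
    cold = _MODEL is None
    if cold:
        _MODEL = load_model()  # cold start: pay the initialization cost once
    start = time.perf_counter()
    label = _MODEL(event.get("image", b""))
    latency_ms = (time.perf_counter() - start) * 1000
    return {"label": label, "cold_start": cold, "latency_ms": latency_ms}


if __name__ == "__main__":
    # First call simulates a cold start; the second reuses the warm instance.
    print(handler({"image": b"..."}))
    print(handler({"image": b"..."}))
```

In a real deployment the warm instance (and therefore the cached model) is kept alive by the platform's pre-warming or keep-alive policy, which is precisely the resource-versus-latency trade-off the paper analyzes.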

Keywords