Al-Iraqia Journal for Scientific Engineering Research (Dec 2023)

AI Workload Allocation Methods for Edge-Cloud Computing: A Review

  • Sarah Ammar Rafea,
  • Ammar Dawood Jasim

DOI
https://doi.org/10.58564/IJSER.2.4.2023.125
Journal volume & issue
Vol. 2, no. 4

Abstract

Read online

Edge computing is used alongside cloud computing as an extension that improves the performance of delay-sensitive applications such as autonomous vehicles, healthcare systems, and video surveillance systems. The rapid growth in the number of Internet of Things (IoT) devices increases the amount of data transferred across the network, yet IoT devices are resource-constrained in terms of energy consumption and computation capability. Edge devices enable data to be processed near the IoT devices that generate it, which reduces both the transmission power spent sending data to the cloud and the delay incurred because the cloud is located far from those devices. Most real-time applications depend on artificial intelligence (AI) techniques, which increases the computational load on IoT and edge devices; conversely, if this AI workload is executed on the cloud, the increased delay degrades application performance. Deciding where computation should be performed in an IoT-edge-cloud network is therefore an important issue. The purpose of optimizing the workload allocation decision is to improve application performance in terms of Quality of Experience (QoE) and Quality of Service (QoS); the major goal is to reduce delay while maintaining the accuracy of the AI system. As presented in this review, many researchers focus on proposing workload allocation decisions based on AI techniques, while other work focuses on the AI workload itself, presenting methods for partitioning the AI model to preserve accuracy on resource-constrained devices (end devices and edge servers). Other studies use AI models for resource allocation and provisioning between edge servers and the cloud. In this review, the integration of AI with the edge-cloud environment is investigated, AI workload allocation methods are presented and analyzed, a brief overview of the application of deep learning in edge-cloud computing is given, and the challenges that must be addressed for AI applications are discussed, together with open issues in optimizing the edge.

Keywords