IEEE Access (Jan 2024)

An Efficient IoT-Fog-Cloud Resource Allocation Framework Based on Two-Stage Approach

  • Ismail Zahraddeen Yakubu,
  • M. Murali

DOI
https://doi.org/10.1109/ACCESS.2024.3405581
Journal volume & issue
Vol. 12
pp. 75384–75395

Abstract

With the advent of the Internet of Things (IoT) paradigm and the prolific growth of technology, the volume of data generated by intelligent devices has increased tremendously. Cloud computing offers virtually unlimited processing and storage capacity for this data. However, the cloud paradigm suffers from high transmission latency, high energy consumption, and a lack of location awareness, while much of the data generated by intelligent devices is delay-sensitive and must be processed on the fly. Cloud computing alone is therefore unsuitable for executing such delay-sensitive workloads. To mitigate these issues, the fog paradigm was introduced, allowing data to be processed in the proximity of IoT devices. Fog resources, however, are limited in capacity, which makes the fog layer unsuitable for processing large volumes of data on its own. Ensuring the smooth execution of delay-sensitive application tasks alongside the large volumes of data generated therefore requires the fog and cloud paradigms to collaborate towards a common goal. In this paper, an efficient resource allocation framework is proposed to utilise fog and cloud resources effectively for executing the delay-sensitive tasks and the huge volumes of data generated by end users. Resources are allocated to tasks in two stages. First, the tasks in the arrival queue are classified according to their guarantee ratio on the cloud and fog layers and allocated to suitable resources in the layers of their respective classes. Second, a Bayes classifier trained on previous allocation history is applied to classify newly arrived tasks and allocate suitable resources for their execution in the layers of their respective classes. A Crayfish Optimization Algorithm (COA) is used to generate an optimal resource allocation in both the fog and cloud layers that reduces the delay and execution time of the system. The proposed method is implemented using the iFogSim simulator toolkit, and the simulation results compare favourably with state-of-the-art methods.
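To make the two-stage idea concrete, the sketch below is a minimal, hypothetical illustration in Python, not the authors' implementation (the paper's evaluation is built on the Java-based iFogSim toolkit). The Task fields, the guarantee-ratio formula, the fog and cloud capacity figures, and the use of scikit-learn's GaussianNB are all assumptions for illustration only: stage one labels historical tasks by comparing their guarantee ratio on each layer, and stage two fits a Bayes classifier on that history to route newly arriving tasks. The COA search that refines the allocation within each layer is omitted here.

```python
# Hypothetical sketch of the two-stage allocation described in the abstract.
# All names, thresholds, and features are illustrative assumptions, not the
# authors' implementation.
from dataclasses import dataclass
from sklearn.naive_bayes import GaussianNB

@dataclass
class Task:
    length_mi: float    # task length (million instructions) -- assumed feature
    deadline_ms: float  # latency budget -- assumed feature
    data_mb: float      # input data size -- assumed feature

def guarantee_ratio(task: Task, layer_mips: float, layer_rtt_ms: float) -> float:
    """Assumed definition: slack left after network round trip, relative to
    the estimated execution time on the layer."""
    exec_ms = task.length_mi / layer_mips * 1000.0
    return (task.deadline_ms - layer_rtt_ms) / exec_ms if exec_ms > 0 else 0.0

# Illustrative layer capacities: fog is close but slow, cloud is far but fast.
FOG = {"mips": 2_000, "rtt_ms": 5}
CLOUD = {"mips": 20_000, "rtt_ms": 120}

def stage1_label(task: Task) -> str:
    """Stage 1: classify a task by whichever layer gives the better
    guarantee ratio."""
    gr_fog = guarantee_ratio(task, FOG["mips"], FOG["rtt_ms"])
    gr_cloud = guarantee_ratio(task, CLOUD["mips"], CLOUD["rtt_ms"])
    return "fog" if gr_fog >= gr_cloud else "cloud"

# Allocation history: tasks already classified in stage 1.
history = [Task(500, 50, 1), Task(80_000, 5_000, 200), Task(300, 20, 0.5)]
X = [[t.length_mi, t.deadline_ms, t.data_mb] for t in history]
y = [stage1_label(t) for t in history]

# Stage 2: fit a Bayes classifier on the allocation history and use it to
# route a newly arriving task to the fog or cloud layer.
clf = GaussianNB().fit(X, y)
new_task = Task(length_mi=1_000, deadline_ms=40, data_mb=2)
layer = clf.predict([[new_task.length_mi, new_task.deadline_ms, new_task.data_mb]])[0]
print(f"route new task to: {layer}")
```

In this reading, stage one supplies the labelled history cheaply from the guarantee-ratio rule, and stage two amortises that cost by letting the classifier route future arrivals without recomputing ratios per layer; the COA would then pick the specific fog or cloud node within the predicted layer.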

Keywords