IEEE Access (Jan 2024)
An Energy and Temperature Aware Deep Reinforcement Learning Workflow Scheduler in Cloud Computing
Abstract
Workflow scheduling is a crucial challenge in cloud computing: the task dependencies within a workflow make the scheduling process more complex and incur substantial energy consumption for the Cloud Service Provider (CSP). Scheduling workflows onto VMs without considering the runtime capacity required by tasks increases energy consumption, makespan, and operational costs for the CSP. Cooling costs must also be considered: whenever the temperature at a datacenter rises, the CSP incurs a definite overhead in cooling, which in turn raises costs for both the CSP and cloud users. Therefore, to mitigate these operational costs and to reduce energy consumption in cloud computing, an energy- and temperature-aware workflow scheduler based on the deep reinforcement learning technique DQN is proposed. It computes task priorities from their computational requirements and datacenter priorities from their temperatures; these priorities are then fed to a scheduler driven by a DQN model, which schedules tasks accordingly. The proposed Energy- and Temperature-Aware Workflow Scheduler based on Deep Reinforcement Learning (ETAWSDRL) is implemented on WorkflowSim and evaluated on scientific workflows (Epigenomics, LIGO, CyberShake, Montage) against the existing FCFS, PSO, and ACO approaches. Results show that ETAWSDRL achieves significant improvement over the compared approaches, improving makespan, resource utilization, energy consumption, and scheduling overhead by 86.7%, 78.32%, 87.1%, and 36.2%, respectively.
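To make the scheduling idea concrete, the following is a minimal illustrative sketch (not the authors' code, which is implemented on WorkflowSim in Java): it shows one plausible way to compute task priorities from computational demand, datacenter priorities from temperature, and to let a small DQN map the resulting state to a VM choice. All attribute names, the priority formulas, and the network shape are assumptions for illustration only.

```python
# Hypothetical sketch of priority computation + DQN action selection.
# Priority formulas and names (length_mi, temperature_c, n_vms) are
# assumed stand-ins, not the paper's actual definitions.
import random
import torch
import torch.nn as nn

def task_priority(length_mi, deadline_s):
    # Assumed rule: higher computational demand per unit of slack
    # yields a higher scheduling priority.
    return length_mi / max(deadline_s, 1e-6)

def datacenter_priority(temperature_c, max_safe_c=35.0):
    # Assumed rule: cooler datacenters get higher priority, since
    # they impose lower cooling overhead on the CSP.
    return max(max_safe_c - temperature_c, 0.0) / max_safe_c

class QNetwork(nn.Module):
    """Maps a state (task + datacenter priorities) to Q-values per VM."""
    def __init__(self, state_dim, n_vms, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, n_vms),
        )

    def forward(self, state):
        return self.net(state)

def select_vm(qnet, state, epsilon=0.1):
    # Epsilon-greedy policy: explore a random VM occasionally,
    # otherwise pick the VM with the highest predicted Q-value.
    n_vms = qnet.net[-1].out_features
    if random.random() < epsilon:
        return random.randrange(n_vms)
    with torch.no_grad():
        return int(qnet(state).argmax().item())

# Example state: three task priorities and two datacenter priorities.
state = torch.tensor([task_priority(12000, 60), task_priority(8000, 30),
                      task_priority(4000, 90),
                      datacenter_priority(28.0), datacenter_priority(33.5)])
qnet = QNetwork(state_dim=5, n_vms=4)
print("chosen VM:", select_vm(qnet, state))
```

In a full DQN scheduler, the reward would be shaped by the quantities the abstract targets (energy, makespan, temperature-driven cooling cost), and the network would be trained from replayed scheduling transitions; the sketch above covers only the priority-to-action path.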
Keywords