IEEE Access (Jan 2024)
DRL-Based Distributed Task Offloading Framework in Edge-Cloud Environment
Abstract
The rapid development of wireless communication technologies and the proliferation of the Internet of Things (IoT) and real-time media streaming have produced enormous growth in computation and data-transmission tasks. Edge-Cloud Computing (ECC) combines the benefits of Mobile Cloud Computing (MCC) and Mobile Edge Computing (MEC) to meet energy-consumption and delay requirements and to achieve more stable and affordable task execution. The most significant challenge in ECC is making real-time task-offloading decisions. To generate offloading decisions in ECC environments efficiently and in a near-optimal manner, a Deep Reinforcement Learning (DRL)-based Distributed task Offloading (DRL-DO) framework is proposed. The proposed DRL-DO framework and the baseline offloading algorithms are implemented and evaluated in Python using the Keras machine-learning library. Experimental results demonstrate the effectiveness of the DRL-DO framework: it achieves a high Gain Ratio (GR) of about 22.3% and greatly reduces energy consumption, response time, and system utility by about 7.6%, 43%, and 26.2%, respectively, while incurring moderate time cost compared with other offloading algorithms.
Keywords