Journal of Cloud Computing: Advances, Systems and Applications (Sep 2024)

Dependency-aware online task offloading based on deep reinforcement learning for IoV

  • Chunhong Liu,
  • Huaichen Wang,
  • Mengdi Zhao,
  • Jialei Liu,
  • Xiaoyan Zhao,
  • Peiyan Yuan

DOI
https://doi.org/10.1186/s13677-024-00701-0
Journal volume & issue
Vol. 13, no. 1
pp. 1–17

Abstract


The convergence of artificial intelligence and in-vehicle wireless communication technologies promises to fulfill the pressing communication needs of the Internet of Vehicles (IoV) while promoting the development of vehicle applications. However, making real-time dependency-aware task offloading decisions is difficult due to the high mobility of vehicles and the dynamic nature of the network environment. This leads to additional application computation time and energy consumption, increasing the risk of offloading failures for computationally intensive and latency-sensitive applications. In this paper, an offloading strategy for vehicle applications that jointly considers latency and energy consumption in the base station cooperative computing model is proposed. Firstly, we establish a collaborative offloading model involving multiple vehicles, multiple base stations, and multiple edge servers; vehicular applications are transferred to the application queues of edge servers and prioritized according to their completion deadlines. Secondly, each vehicular application is modeled as a directed acyclic graph (DAG) task with data dependency relationships. Subsequently, we propose a dependency-aware task offloading method based on deep reinforcement learning (DAG-DQN): tasks are assigned to edge servers at different base stations, and the edge servers cooperate to process them, minimizing application completion time and reducing edge server energy consumption. Finally, simulation results show that, compared with a heuristic method, the proposed DAG-DQN method reduces task completion time by 16%, reduces system energy consumption by 19%, and improves decision-making efficiency by 70%.
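To make the abstract's pipeline concrete, the following is a minimal, hypothetical sketch of its three ingredients: DAG subtasks with dependency tracking, a deadline-ordered application queue, and a small Q-network that picks an edge server per ready subtask. All names, feature dimensions, and the network shape are illustrative assumptions; the paper's actual DAG-DQN design is not reproduced here.

```python
import heapq
import torch
import torch.nn as nn

class Subtask:
    """One node of a vehicular application's DAG (names are assumptions)."""
    def __init__(self, task_id, data_size, deps):
        self.task_id = task_id
        self.data_size = data_size   # data to transfer/compute, e.g. in MB
        self.deps = set(deps)        # predecessor subtask ids (DAG edges)

def ready_subtasks(dag, done):
    # A subtask becomes schedulable once all of its DAG predecessors finish.
    return [t for t in dag if t.task_id not in done and t.deps <= done]

# Applications enter the edge-server queue prioritized by completion deadline.
app_queue = []
heapq.heappush(app_queue, (0.8, "app-1"))   # (deadline in s, application id)
heapq.heappush(app_queue, (0.3, "app-2"))

NUM_SERVERS = 4   # action space: which base station's edge server to use
STATE_DIM = 6     # e.g. data size, deadline slack, per-server load (assumed)

class QNet(nn.Module):
    """Small MLP scoring each candidate edge server for one subtask."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, NUM_SERVERS),   # one Q-value per offloading action
        )

    def forward(self, s):
        return self.net(s)

qnet = QNet()
deadline, app = heapq.heappop(app_queue)     # most urgent application first
dag = [Subtask(0, 2.5, []), Subtask(1, 1.0, [0])]
for sub in ready_subtasks(dag, done=set()):  # only dependency-free subtasks
    state = torch.rand(1, STATE_DIM)         # toy state vector
    server = qnet(state).argmax(dim=1).item()  # greedy offloading decision
    print(f"{app}: offload subtask {sub.task_id} to edge server {server}")
```

In a full DQN loop, the greedy choice would be wrapped in epsilon-greedy exploration and trained from (state, action, reward, next state) transitions, with the reward reflecting the latency/energy trade-off the abstract describes.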

Keywords