IEEE Open Journal of the Communications Society (Jan 2024)

Deep Reinforcement Learning for Energy-Efficient Data Dissemination Through UAV Networks

  • Abubakar S. Ali,
  • Ahmed A. Al-Habob,
  • Shimaa Naser,
  • Lina Bariah,
  • Octavia A. Dobre,
  • Sami Muhaidat

DOI: https://doi.org/10.1109/OJCOMS.2024.3398718
Journal volume & issue: Vol. 5, pp. 5567–5583

Abstract


The rise of the Internet of Things (IoT), marked by unprecedented growth in connected devices, has created an insatiable demand for additional computational and communication resources. The integration of unmanned aerial vehicles (UAVs) within IoT ecosystems offers a promising way to meet this demand, providing extended network coverage, agile deployment, and efficient data gathering from geographically challenging locations. Despite these benefits, UAV technology faces significant challenges, including limited energy resources, the need to adapt to dynamic environments, and the requirement for autonomous operation to satisfy the evolving demands of IoT networks. In light of this, we introduce a UAV-assisted data dissemination framework that minimizes the total energy expenditure of both the UAV and the spatially distributed IoT devices. The framework addresses three interconnected subproblems: device classification, device association, and path planning. For device classification, we employ two distinct deep reinforcement learning (DRL) agents, a Double Deep Q-Network (DDQN) and Proximal Policy Optimization (PPO), to classify devices into two tiers. For device association, we propose a nearest-neighbor heuristic that associates each Tier 2 device with a Tier 1 device. For path planning, we apply the Lin-Kernighan heuristic to plan the UAV's path among the Tier 1 devices. We compare our method with three baseline approaches and show through simulation results that it significantly reduces energy consumption and delivers a near-optimal solution in a fraction of the time required by brute-force and ant colony methods. The framework therefore offers an efficient and practical alternative for energy-efficient data dissemination in UAV-assisted IoT networks.
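To make the device association and path planning steps concrete, the following is a minimal, hypothetical sketch of that part of the pipeline: each Tier 2 device is attached to its nearest Tier 1 device, and a visiting order over the Tier 1 devices is then computed. The paper uses the Lin-Kernighan heuristic for the tour; here a simpler nearest-neighbor tour with a 2-opt improvement pass stands in for it. All coordinates, names, and parameters below are illustrative assumptions, not taken from the paper.

import numpy as np

# Illustrative device layout (coordinates in meters); not from the paper.
rng = np.random.default_rng(0)
tier1 = rng.uniform(0, 1000, size=(8, 2))    # Tier 1 device positions
tier2 = rng.uniform(0, 1000, size=(40, 2))   # Tier 2 device positions

# --- Device association: attach each Tier 2 device to its nearest Tier 1 device.
dists = np.linalg.norm(tier2[:, None, :] - tier1[None, :, :], axis=-1)
association = dists.argmin(axis=1)           # serving Tier 1 index per Tier 2 device

# --- Path planning over Tier 1 devices: nearest-neighbor tour + 2-opt refinement
#     (a simplified stand-in for the Lin-Kernighan heuristic used in the paper).
def tour_length(order, pts):
    path = pts[order]
    return (np.linalg.norm(np.diff(path, axis=0), axis=1).sum()
            + np.linalg.norm(path[-1] - path[0]))

def nearest_neighbor_tour(pts):
    unvisited = list(range(1, len(pts)))
    order = [0]
    while unvisited:
        last = pts[order[-1]]
        nxt = min(unvisited, key=lambda i: np.linalg.norm(pts[i] - last))
        order.append(nxt)
        unvisited.remove(nxt)
    return order

def two_opt(order, pts):
    improved = True
    while improved:
        improved = False
        for i in range(1, len(order) - 1):
            for j in range(i + 1, len(order)):
                # Reverse the segment [i, j] and keep it if the tour gets shorter.
                candidate = order[:i] + order[i:j + 1][::-1] + order[j + 1:]
                if tour_length(candidate, pts) < tour_length(order, pts):
                    order, improved = candidate, True
    return order

order = two_opt(nearest_neighbor_tour(tier1), tier1)
print("Tier 1 visiting order:", order)
print("Tour length (m): %.1f" % tour_length(order, tier1))
print("Tier 2 devices per Tier 1 device:", np.bincount(association, minlength=len(tier1)))

In the paper's full framework, the tier labels themselves come from the DRL agents (DDQN or PPO) rather than being fixed in advance; this sketch only covers the two heuristic stages that follow that classification.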

Keywords