IEEE Open Journal of the Communications Society (Jan 2024)
ComputiFi: Latency-Optimized Task Offloading in Multipath Multihop LiFi-WiFi Networks
Abstract
The increasing prevalence of latency-critical applications such as Ultra-Reliable Low-Latency Communications (URLLC), factory automation, and Artificial Intelligence (AI) for image classification demands efficient computational task offloading strategies that meet users' stringent latency requirements. Traditional Wireless-Fidelity (WiFi) networks often fall short due to interference and limited bandwidth, motivating complementary technologies such as Light-Fidelity (LiFi), which offers higher data rates and lower latency. Existing single-path offloading approaches do not fully utilize network resources, leading to suboptimal performance. This paper introduces ComputiFi, a task offloading framework that exploits a multipath, multihop LiFi-WiFi network architecture to minimize latency. By dynamically selecting the offloading destination, splitting data across the available technologies, and managing resources across multiple computational entities, ComputiFi addresses the complexities of heterogeneous networks. By capitalizing on multipath transmissions and multiple candidate offloading destinations, it offers a robust solution for latency-critical scenarios. ComputiFi solves the resulting offloading problem using a variety of optimization tools, including Mixed-Integer Nonlinear Programming (MINLP) solvers, meta-heuristics, Deep Reinforcement Learning (DRL), and black-box optimization techniques. Performance evaluations show that ComputiFi consistently reduces average user latency by up to 69.3%. Furthermore, for real-time operation, it provides a DRL solution with prediction times on the order of milliseconds, achieving a 40.23% latency improvement over baseline methods.
Keywords