IEEE Access (Jan 2021)
Implementing Practical DNN-Based Object Detection Offloading Decision for Maximizing Detection Performance of Mobile Edge Devices
Abstract
In the last decade, deep neural network (DNN)-based object detection technologies have received significant attention as a promising solution for implementing a variety of image understanding and video analysis applications on mobile edge devices. However, mobile edge devices often cannot execute computationally intensive DNN-based object detection workloads while meeting high-accuracy and low-latency requirements, owing to their limited computation capacity. In this paper, we implement and evaluate a DNN-based object detection offloading framework that improves the object detection performance of mobile edge devices by offloading computation-intensive workloads to a remote edge server. However, preliminary experimental results show that offloading all object detection workloads of mobile edge devices may lead to worse performance than executing the workloads locally. This degradation stems from inefficient resource utilization in the edge computing architecture, on both the edge server and the mobile edge devices. To resolve this degradation problem, we devise a device-aware DNN offloading decision algorithm that aims to maximize resource utilization in the edge computing architecture. The proposed algorithm decides whether or not to offload the object detection workloads of edge devices by considering their computing power and network bandwidth, thereby maximizing their average object detection processing rate in frames per second. Through various experiments conducted in a real-life wireless local area network (WLAN) environment, we verify the effectiveness of the proposed DNN-based object detection offloading framework.
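To make the decision criterion concrete, the following is a minimal sketch of a device-aware offloading rule of the kind described above, assuming the decision reduces to comparing estimated frames per second (FPS) for local execution versus offloading. All class names, parameters, and the simple FPS model are illustrative assumptions and are not taken from the paper.

```python
# Hypothetical sketch: offload only when the estimated offloaded FPS,
# accounting for uplink transfer and server-side inference, exceeds the
# device's local inference FPS. All values and names are assumptions.

from dataclasses import dataclass


@dataclass
class DeviceProfile:
    local_inference_s: float   # measured per-frame DNN inference time on the device
    uplink_mbps: float         # measured uplink bandwidth to the edge server
    frame_size_mbit: float     # size of one (compressed) frame sent when offloading


@dataclass
class ServerProfile:
    server_inference_s: float  # per-frame inference time on the edge server
    queue_delay_s: float       # current estimated queuing delay at the server


def estimated_fps_local(dev: DeviceProfile) -> float:
    """Estimated FPS when the device runs the detector itself."""
    return 1.0 / dev.local_inference_s


def estimated_fps_offload(dev: DeviceProfile, srv: ServerProfile) -> float:
    """Estimated FPS when frames are sent to the edge server for detection."""
    transfer_s = dev.frame_size_mbit / dev.uplink_mbps
    per_frame_s = transfer_s + srv.queue_delay_s + srv.server_inference_s
    return 1.0 / per_frame_s


def should_offload(dev: DeviceProfile, srv: ServerProfile) -> bool:
    """Offload only if the estimated offloaded FPS beats local FPS."""
    return estimated_fps_offload(dev, srv) > estimated_fps_local(dev)


if __name__ == "__main__":
    # A weak device benefits from offloading; a strong device may not.
    weak_device = DeviceProfile(local_inference_s=0.40, uplink_mbps=50.0, frame_size_mbit=1.2)
    strong_device = DeviceProfile(local_inference_s=0.05, uplink_mbps=50.0, frame_size_mbit=1.2)
    server = ServerProfile(server_inference_s=0.02, queue_delay_s=0.01)

    print("weak device offloads:", should_offload(weak_device, server))      # expected: True
    print("strong device offloads:", should_offload(strong_device, server))  # expected: False
```

In this sketch, the per-device decision depends only on the device's computing power (local inference time) and its network bandwidth, mirroring the inputs named in the abstract; the actual algorithm in the paper may use a different model and additional server-side state.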
Keywords