Digital Communications and Networks (Feb 2019)

Enabling intelligence in fog computing to achieve energy and latency reduction

  • Quang Duy La,
  • Mao V. Ngo,
  • Thinh Quang Dinh,
  • Tony Q.S. Quek,
  • Hyundong Shin

Journal volume & issue
Vol. 5, no. 1
pp. 3–9

Abstract


Fog computing is an emerging architecture intended to alleviate the network burden at the cloud and the core network by moving resource-intensive functionalities such as computation, communication, storage, and analytics closer to the End Users (EUs). To address the energy-efficiency and latency requirements of time-critical Internet-of-Things (IoT) applications, fog computing systems can apply intelligence features in their operations to take advantage of readily available data and computing resources. In this paper, we propose an approach that involves device-driven and human-driven intelligence as key enablers for reducing energy consumption and latency in fog computing, illustrated via two case studies. The first uses machine learning to detect user behaviors and perform adaptive, low-latency Medium Access Control (MAC)-layer scheduling among sensor devices. In the second case study, on task offloading, we design an algorithm that lets an intelligent EU device select its offloading decision in the presence of multiple nearby fog nodes while simultaneously minimizing its own energy and latency objectives. Our results show a large but largely untapped potential for intelligence in tackling the challenges of fog computing.

Keywords: Fog computing, Edge computing, Machine learning, MAC scheduling, Computational offloading, Energy efficiency
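The abstract's second case study, an EU device choosing among nearby fog nodes while balancing energy and latency, can be illustrated with a minimal sketch. The weighted-sum cost, the option names, and all numbers below are illustrative assumptions for exposition, not the paper's actual algorithm or data:

```python
# Hypothetical sketch of the offloading decision outlined in the abstract:
# the EU device evaluates each candidate execution target (local, a fog
# node, or the remote cloud) and picks the one minimizing a weighted sum
# of its energy and latency objectives. All values are made up.

def offload_cost(energy_j, latency_s, w_energy=0.5, w_latency=0.5):
    """Weighted-sum objective combining energy (joules) and latency (seconds)."""
    return w_energy * energy_j + w_latency * latency_s

def choose_offload_target(options):
    """options: dict mapping target name -> (energy_j, latency_s).
    Returns the name of the cheapest target under the weighted objective."""
    return min(options, key=lambda name: offload_cost(*options[name]))

options = {
    "local":      (2.0, 0.50),   # compute on the EU device itself
    "fog_node_1": (0.4, 0.30),   # offload over one wireless hop
    "fog_node_2": (0.6, 0.20),
    "cloud":      (0.3, 1.20),   # low energy, but long round-trip latency
}
print(choose_offload_target(options))  # -> fog_node_1
```

With equal weights, the nearby fog node beats both local execution (energy-heavy) and the cloud (latency-heavy), which is the trade-off the paper's case study exploits.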