Sistemnì Doslìdženâ ta Informacìjnì Tehnologìï (Oct 2022)

Resource scheduling in edge computing IoT networks using hybrid deep learning algorithm

  • G. Vijayasekaran,
  • M. Duraipandian

DOI
https://doi.org/10.20535/SRIT.2308-8893.2022.3.06
Journal volume & issue
no. 3
pp. 86–101

Abstract


The proliferation of the Internet of Things (IoT) and wireless sensor networks enhances data communication. The demand for data communication is rapidly increasing, which calls for the emerging edge computing paradigm. Edge computing plays a major role in IoT networks by providing computing resources close to the users. Moving services from the cloud toward the users improves their communication, storage, and networking capabilities. However, massive IoT networks require a wide range of resources for their computations; to meet this demand, resource scheduling algorithms are employed in edge computing. Statistical and machine learning-based resource scheduling algorithms have evolved over the past decade, but their performance can be improved if resource requirements are analyzed further. This work presents a deep learning-based resource scheduling approach for edge computing IoT networks that combines a deep bidirectional recurrent neural network (BRNN) with a convolutional neural network. Before scheduling, the IoT users are grouped into clusters using a spectral clustering algorithm. Simulation analysis of the proposed model verifies its performance in terms of delay, response time, execution time, and resource utilization. Existing resource scheduling algorithms such as the genetic algorithm (GA), Improved Particle Swarm Optimization (IPSO), and LSTM-based models are compared with the proposed model to validate its superior performance.
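
As a rough illustration of the pipeline the abstract describes, the Python sketch below (not the authors' implementation) first groups IoT users with spectral clustering and then builds a small hybrid CNN + bidirectional RNN that scores candidate edge nodes for a user's request history. All feature names, dimensions, the number of clusters, and the choice of a GRU cell for the bidirectional layer are illustrative assumptions, not details taken from the paper.

    # Minimal sketch of the two-stage idea: spectral clustering of users,
    # then a hybrid CNN + bidirectional RNN scheduler. Shapes and names
    # are hypothetical placeholders.
    import numpy as np
    from sklearn.cluster import SpectralClustering
    from tensorflow.keras import layers, models

    # --- Stage 1: group IoT users by similarity of resource demands ---
    rng = np.random.default_rng(0)
    user_features = rng.random((200, 8))   # 200 users, 8 demand features (assumed)
    clusters = SpectralClustering(
        n_clusters=4, affinity="nearest_neighbors", random_state=0
    ).fit_predict(user_features)
    print("users per cluster:", np.bincount(clusters))

    # --- Stage 2: hybrid CNN + bidirectional RNN over a request history ---
    # Input: a sequence of 16 time steps, each with the 8 demand features.
    inputs = layers.Input(shape=(16, 8))
    x = layers.Conv1D(32, kernel_size=3, padding="same", activation="relu")(inputs)
    x = layers.Bidirectional(layers.GRU(32))(x)          # bidirectional recurrent layer
    outputs = layers.Dense(4, activation="softmax")(x)   # score 4 candidate edge nodes
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="categorical_crossentropy")
    model.summary()

In this sketch the convolutional layer extracts local patterns from the demand sequence and the bidirectional layer reads the sequence in both directions before a softmax ranks the candidate edge nodes; the actual network depth, loss, and scheduling objective used by the authors would need to be taken from the full paper.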

Keywords