EAI Endorsed Transactions on Energy Web (Jul 2024)

Revolutionizing Cloud Resource Allocation: Harnessing Layer-Optimized Long Short-Term Memory for Energy-Efficient Predictive Resource Management

  • Prathigadapa Sireesha
  • Vishnu Priyan S
  • M Govindarajan
  • Sounder Rajan
  • V Rajakumareswaran

DOI: https://doi.org/10.4108/ew.6505
Journal volume & issue: Vol. 11

Abstract

INTRODUCTION: Accurate data center resource projection is challenging because the workloads of multi-tenant, co-hosted applications are dynamic and constantly changing. Resource Management in the Cloud (RMC) has therefore become a significant research area. Under the cloud's flexible service options, users can pay either a fixed sum or a charge based on usage time.

OBJECTIVES: The main goal of this study is a systematic method for estimating future cloud resource requirements from historical consumption. Distributing resources to users who require a variety of resources is one of the main objectives of cloud computing addressed in this study.

METHODS: This article proposes a Layer Optimized Long Short-Term Memory (LOLSTM) model to estimate the resource requirements for upcoming time slots. The model also detects SLA violations when the QoS value exceeds a dynamic threshold and then proposes appropriate countermeasures based on the risk associated with the violation.

RESULTS: The model achieves training and validation accuracies of 97.6% and 95.9%, respectively; the RMSE and MAD error rates are 0.127 and 0.107; and the training and validation losses at epoch 100 are minimal, at 0.6092 and 0.5828, respectively. The suggested technique therefore performed better than the existing techniques.

CONCLUSION: In this work, the resource requirements for future time slots are predicted using the LOLSTM technique, which regularizes the network weights and avoids overfitting. The proposed work also takes the necessary actions when the model recognizes an SLA violation. Overall, the proposed approach shows better performance than the existing methods.
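The abstract gives no implementation details, but the METHODS description (an LSTM-based predictor of next-slot resource demand plus a dynamic QoS threshold for SLA-violation detection) can be sketched roughly as follows. This is a minimal Python/Keras illustration, not the authors' LOLSTM: the layer sizes, dropout rate, window length WINDOW, and the rolling mean-plus-k-standard-deviations threshold rule are assumptions introduced here for clarity.

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

WINDOW = 12  # past time slots used to predict the next one (assumed)

def build_predictor() -> tf.keras.Model:
    """Stacked LSTM with dropout as a stand-in for layer-optimized regularization."""
    model = models.Sequential([
        layers.Input(shape=(WINDOW, 1)),      # univariate utilization series
        layers.LSTM(64, return_sequences=True),
        layers.Dropout(0.2),                  # regularizes weights, limits overfitting
        layers.LSTM(32),
        layers.Dense(1),                      # predicted utilization for the next slot
    ])
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model

def make_windows(series: np.ndarray, window: int = WINDOW):
    """Turn a 1-D usage history into (samples, window, 1) inputs and next-step targets."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(X)[..., None], np.array(y)

def sla_violation(qos_history: np.ndarray, current_qos: float, k: float = 2.0) -> bool:
    """Flag a violation when the current QoS value exceeds a dynamic threshold
    (here assumed to be the rolling mean plus k standard deviations)."""
    threshold = qos_history.mean() + k * qos_history.std()
    return current_qos > threshold

if __name__ == "__main__":
    # Synthetic CPU-utilization trace, purely for demonstration.
    rng = np.random.default_rng(0)
    usage = 0.5 + 0.3 * np.sin(np.linspace(0, 20, 500)) + 0.05 * rng.standard_normal(500)
    X, y = make_windows(usage)
    model = build_predictor()
    model.fit(X, y, epochs=5, batch_size=32, verbose=0)
    next_slot = float(model.predict(X[-1:], verbose=0)[0, 0])
    print(f"predicted utilization for next slot: {next_slot:.3f}")
    print("SLA violation:", sla_violation(usage[-50:], current_qos=0.95))

In this sketch, prediction and violation detection are decoupled: the LSTM forecasts demand for proactive allocation, while the threshold check runs on observed QoS values; the paper's actual countermeasure logic is not reproduced here.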

Keywords