Knowledge Engineering and Data Science (Jun 2021)

Backpropagation Neural Network with Combination of Activation Functions for Inbound Traffic Prediction

  • Purnawansyah Purnawansyah,
  • Haviluddin Haviluddin,
  • Herdianti Darwis,
  • Huzain Azis,
  • Yulita Salim

DOI
https://doi.org/10.17977/um018v4i12021p14-28
Journal volume & issue
Vol. 4, no. 1
pp. 14 – 28

Abstract


Predicting network traffic is crucial for preventing congestion and maintaining a high quality of network services. This research uses backpropagation to predict the inbound traffic level in order to understand and determine internet usage. The architecture consists of one input layer, two hidden layers, and one output layer. The study compares three activation functions: sigmoid, rectified linear unit (ReLU), and hyperbolic tangent (tanh). Three learning rates, 0.1, 0.5, and 0.9, represent low, moderate, and high rates, respectively. Based on the results, among the single activation functions, although sigmoid yields the lowest RMSE and MSE values, ReLU is superior at learning high-traffic patterns with a learning rate of 0.9. In addition, ReLU is more effective when placed first in a combination. Hence, combining a high learning rate with pure ReLU, ReLU-sigmoid, or ReLU-tanh is more suitable and recommended for predicting upper traffic utilization.
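The architecture described above (one input layer, two hidden layers with mixed activations, one output layer, trained by backpropagation) can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the layer sizes, weight initialization, toy data, and training loop are all assumptions; only the general structure (two hidden layers, selectable sigmoid/ReLU/tanh activations, configurable learning rate) follows the abstract.

```python
import numpy as np

# Activation functions and their derivatives (derivatives are written in
# terms of the activation output, which is convenient for backpropagation).
def sigmoid(x): return 1.0 / (1.0 + np.exp(-x))
def d_sigmoid(y): return y * (1.0 - y)
def relu(x): return np.maximum(0.0, x)
def d_relu(y): return (y > 0).astype(float)
def tanh(x): return np.tanh(x)
def d_tanh(y): return 1.0 - y ** 2

ACTS = {"sigmoid": (sigmoid, d_sigmoid),
        "relu": (relu, d_relu),
        "tanh": (tanh, d_tanh)}

class BPNN:
    """Backpropagation network: input -> hidden1 -> hidden2 -> output.

    hidden_acts names the two hidden-layer activations in order,
    e.g. ("relu", "sigmoid") for the ReLU-sigmoid combination.
    The output layer is linear, a common choice for regression targets.
    """
    def __init__(self, sizes, hidden_acts=("relu", "sigmoid"), lr=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.lr = lr
        self.W = [rng.normal(0, 0.5, (a, b)) for a, b in zip(sizes[:-1], sizes[1:])]
        self.b = [np.zeros(b) for b in sizes[1:]]
        self.acts = [ACTS[a] for a in hidden_acts]

    def forward(self, x):
        self.outs = [x]
        for i, (W, b) in enumerate(zip(self.W, self.b)):
            z = self.outs[-1] @ W + b
            # hidden layers use the chosen activations; output layer is linear
            a = self.acts[i][0](z) if i < len(self.acts) else z
            self.outs.append(a)
        return self.outs[-1]

    def backward(self, y):
        # MSE loss; delta at the linear output layer
        delta = (self.outs[-1] - y) / len(y)
        for i in reversed(range(len(self.W))):
            gW = self.outs[i].T @ delta
            gb = delta.sum(axis=0)
            if i > 0:  # propagate through the old weights before updating
                delta = (delta @ self.W[i].T) * self.acts[i - 1][1](self.outs[i])
            self.W[i] -= self.lr * gW
            self.b[i] -= self.lr * gb

# Toy usage on a synthetic 1-D pattern (illustrative data, not the
# paper's inbound traffic series).
X = np.linspace(0, 1, 32).reshape(-1, 1)
y = 0.5 * np.sin(2 * np.pi * X) + 0.5
net = BPNN([1, 8, 8, 1], hidden_acts=("relu", "sigmoid"), lr=0.5)
mse0 = float(np.mean((net.forward(X) - y) ** 2))
for _ in range(2000):
    net.forward(X)
    net.backward(y)
mse = float(np.mean((net.forward(X) - y) ** 2))
```

Swapping `hidden_acts` and `lr` lets one reproduce the kinds of comparisons the paper reports, e.g. pure sigmoid versus ReLU-tanh at learning rates 0.1, 0.5, and 0.9.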