MethodsX (Jun 2024)

Integrating Dropout and Kullback-Leibler Regularization in Bayesian Neural Networks for Improved Uncertainty Estimation in Regression

  • Raghavendra M. Devadas,
  • Vani Hiremani

Journal volume & issue
Vol. 12
p. 102659

Abstract


The objective of the study is to improve uncertainty estimation in regression problems by introducing a Bayesian Neural Network (BNN) model that integrates dropout and Kullback-Leibler (KL) regularization. Experimental results reveal significant improvements in both uncertainty estimation and point forecasts with the integrated BNN model compared to a plain BNN. Performance metrics, including mean squared error (MSE), mean absolute error (MAE), and R-squared (R²), demonstrate superior results for the proposed model: the plain BNN obtains an MSE of 87.3, an MAE of 6.62, and an R² of −0.0492, whereas the proposed BNN obtains an MSE of 44.64, an MAE of 4.4, and an R² of 0.46. By combining dropout and KL regularization, the approach improves model stability, mitigates overfitting, and yields more reliable uncertainty estimates, making it a practical tool for regression tasks that require quantified uncertainty. The study thereby adds to our knowledge of uncertainty-aware machine learning models and offers a solution for assessing uncertainty in a range of applications.
  • The proposed BNN model combines Bayesian principles with dropout and KL regularization.
  • The Boston Housing dataset is used for both training and evaluation of the model.
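The abstract does not include implementation details, so the following is a minimal sketch, in PyTorch, of the kind of architecture it describes: a regression network with mean-field Gaussian weight posteriors (giving a closed-form KL term against a standard-normal prior) combined with dropout between hidden layers, trained on an MSE-plus-KL objective and queried with repeated stochastic forward passes for uncertainty. The class names (`BayesianLinear`, `DropoutKLBNN`), layer sizes, dropout rate, and KL weight are illustrative assumptions, not the authors' code; only the 13 input features are taken from the Boston Housing feature count mentioned in the abstract.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class BayesianLinear(nn.Module):
    """Linear layer with a mean-field Gaussian posterior over weights and biases."""

    def __init__(self, in_features, out_features):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(out_features, in_features))
        self.w_rho = nn.Parameter(torch.full((out_features, in_features), -3.0))
        self.b_mu = nn.Parameter(torch.zeros(out_features))
        self.b_rho = nn.Parameter(torch.full((out_features,), -3.0))
        nn.init.kaiming_uniform_(self.w_mu, a=math.sqrt(5))

    def forward(self, x):
        # Reparameterization trick: sample weights from the variational posterior.
        w_sigma = F.softplus(self.w_rho)
        b_sigma = F.softplus(self.b_rho)
        w = self.w_mu + w_sigma * torch.randn_like(w_sigma)
        b = self.b_mu + b_sigma * torch.randn_like(b_sigma)
        return F.linear(x, w, b)

    def kl(self):
        # Closed-form KL divergence between N(mu, sigma^2) and the N(0, 1) prior.
        w_sigma = F.softplus(self.w_rho)
        b_sigma = F.softplus(self.b_rho)
        kl_w = 0.5 * (w_sigma**2 + self.w_mu**2 - 1.0 - 2.0 * torch.log(w_sigma)).sum()
        kl_b = 0.5 * (b_sigma**2 + self.b_mu**2 - 1.0 - 2.0 * torch.log(b_sigma)).sum()
        return kl_w + kl_b


class DropoutKLBNN(nn.Module):
    """Regression BNN combining dropout with KL-regularized Bayesian layers (hypothetical sizes)."""

    def __init__(self, in_features=13, hidden=64, p_drop=0.2):
        super().__init__()
        self.fc1 = BayesianLinear(in_features, hidden)
        self.fc2 = BayesianLinear(hidden, hidden)
        self.out = BayesianLinear(hidden, 1)
        self.drop = nn.Dropout(p_drop)

    def forward(self, x):
        x = self.drop(F.relu(self.fc1(x)))
        x = self.drop(F.relu(self.fc2(x)))
        return self.out(x)

    def kl(self):
        return self.fc1.kl() + self.fc2.kl() + self.out.kl()


# Training objective (illustrative): MSE data fit plus a scaled KL penalty.
model = DropoutKLBNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(64, 13), torch.randn(64, 1)  # stand-in batch, not the real dataset
for _ in range(10):
    opt.zero_grad()
    loss = F.mse_loss(model(x), y) + 1e-3 * model.kl() / x.shape[0]
    loss.backward()
    opt.step()

# Predictive uncertainty: keep dropout and weight sampling active at prediction
# time and summarize the spread of repeated stochastic forward passes.
model.train()
with torch.no_grad():
    samples = torch.stack([model(x) for _ in range(100)])
mean, std = samples.mean(0), samples.std(0)  # point forecast and per-point uncertainty
```

In this sketch the KL term plays the role of the Bayesian regularizer on the weight posteriors, while dropout adds a second source of stochasticity; how the two are weighted against the data-fit term is a design choice the abstract does not specify.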

Keywords