Symmetry (Jun 2022)
A Modified Stein Variational Inference Algorithm with Bayesian and Gradient Descent Techniques
Abstract
This paper introduces a novel variational inference (VI) method that combines Bayesian and gradient descent techniques. To facilitate the approximation of posterior distributions over model parameters, the Stein method has been used in Bayesian variational inference algorithms in recent years. Unfortunately, previous methods fail to explicitly describe the influence of the particles' history during tracing (Q(x) in this paper) on the approximation, even though this history carries important information for the particle search. In our paper, Q(x) is considered in the design of the operator Bp, which increases the chance of jumping out of local optima, especially in the case of complex distributions. To address these issues, a modified Stein variational inference algorithm is proposed that makes the gradient descent of the Kullback–Leibler (KL) divergence more random. In our method, a group of particles is used to approximate the target distribution by minimizing the KL divergence, with updates driven by a newly defined kernelized Stein discrepancy. Furthermore, the usefulness of the proposed technique is demonstrated on four data sets, with Bayesian logistic regression used for classification. Statistical measures such as parameter estimates, classification accuracy, F1 score, NRMSE, and others are used to validate the algorithm's performance.
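For readers unfamiliar with the particle-based scheme the abstract describes, the following is a minimal sketch of a standard Stein variational gradient descent (SVGD) update, in which particles move along the kernelized Stein direction to minimize the KL divergence to the target. This is background, not the authors' modified algorithm: the RBF kernel, the names svgd_step and rbf_kernel, and the Gaussian example are illustrative assumptions, and the paper's history term Q(x) and operator Bp are not modeled here.

    import numpy as np

    def rbf_kernel(X, h=1.0):
        """RBF kernel matrix and its gradient w.r.t. the first argument."""
        # Pairwise squared distances between particles, shape (n, n).
        sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        K = np.exp(-sq_dists / (2 * h ** 2))
        # grad_K[i, j] = d K(x_i, x_j) / d x_i, shape (n, n, d).
        grad_K = -(X[:, None, :] - X[None, :, :]) / h ** 2 * K[..., None]
        return K, grad_K

    def svgd_step(X, grad_log_p, step_size=0.1, h=1.0):
        """One SVGD update: particles follow the kernelized Stein
        direction that decreases KL(q || p)."""
        n = X.shape[0]
        K, grad_K = rbf_kernel(X, h)
        scores = np.stack([grad_log_p(x) for x in X])  # (n, d)
        # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j)
        #                          + grad_{x_j} k(x_j, x_i) ]
        phi = (K @ scores + grad_K.sum(axis=0)) / n
        return X + step_size * phi

    # Example: approximate a standard 2-D Gaussian target.
    grad_log_p = lambda x: -x            # score function of N(0, I)
    X = np.random.randn(50, 2) * 3 + 5   # particles start far from the target
    for _ in range(500):
        X = svgd_step(X, grad_log_p)

The attraction term (K @ scores) pulls particles toward high-density regions, while the repulsion term (grad_K.sum) spreads them apart; the paper's modification adds randomness to this descent to help escape local optima.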
Keywords