IEEE Access (Jan 2021)
A Study of Using Bethe/Kikuchi Approximation for Learning Directed Graphical Models
Abstract
This paper applies variational methods to learn the parameters and estimate the probability of evidence of directed graphical models (also known as Bayesian networks (BNs)) when the data contain missing values. One class of variational methods, the Bethe/Kikuchi approximation algorithms, is combined with Expectation-Maximization (EM) to learn BN parameters from data subject to an upper limit on cluster size. The proposed method is novel in that Bethe/Kikuchi algorithms are typically used only to approximate marginal distributions. We review popular Bethe/Kikuchi approximation algorithms for discrete BNs, including Belief Propagation (BP), Iterative Join Graph Propagation (IJGP), and Fundamental Cycle Base (FCB) algorithms, and evaluate their learning performance experimentally on a wide range of discrete BNs used in practice. Experiments show that combining IJGP or FCB with EM yields more accurate results than conventional approximate algorithms such as Evidence Pre-propagation Importance Sampling and Variational Inference. When exact methods are intractable, IJGP or FCB combined with EM can be an effective alternative.
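As context for the approach summarized above, a common way to write variational EM with a Bethe-style entropy surrogate is sketched below; the notation here is ours and is only a sketch of the general technique, not necessarily the paper's exact formulation:

\[
\log p(\mathbf{x};\theta) \;\ge\; \mathbb{E}_{q(\mathbf{z})}\!\left[\log p(\mathbf{x},\mathbf{z};\theta)\right] + H(q)
\;\approx\; \mathbb{E}_{q}\!\left[\log p(\mathbf{x},\mathbf{z};\theta)\right] + \sum_{a} H(b_a) - \sum_{i} (d_i - 1)\, H(b_i),
\]

where \(\mathbf{x}\) denotes the observed values, \(\mathbf{z}\) the missing values, \(b_a\) and \(b_i\) the cluster and single-variable beliefs, and \(d_i\) the number of clusters containing variable \(i\). In this view, the E-step runs an approximate inference algorithm (e.g., BP, IJGP, or FCB) to obtain the beliefs, and the M-step updates \(\theta\) from the resulting expected sufficient statistics.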
Keywords