IEEE Access (Jan 2020)
Model Weighting for One-Dependence Estimators by Measuring the Independence Assumptions
Abstract
Superparent one-dependence estimators (SPODEs) are a popular family of semi-naive Bayesian network classifiers, and averaged one-dependence estimators (AODE) provide efficient single-pass learning with competitive classification accuracy. AODE treats all SPODEs equally, assigning each the same weight. Researchers have proposed applying information-theoretic metrics, such as mutual information or conditional log-likelihood, to assign discriminative weights. However, the independence assumptions underlying different SPODEs may hold to different extents for different instances, so highly scalable learning algorithms are needed to approximate the ground-truth attribute dependencies implicit in training data. In this study, we take each instance as the target and extend AODE by measuring the extent to which each SPODE's independence assumption holds and assigning weights accordingly. The proposed approach, called independence-weighted AODE (IWAODE), is validated on 40 benchmark datasets from the UCI machine learning repository. Experimental results reveal that the resulting weighted SPODEs deliver computationally efficient, low-bias learning, proving to be a competitive alternative to state-of-the-art single and ensemble Bayesian network classifiers (such as tree-augmented naive Bayes, the k-dependence Bayesian classifier, and WAODE-MI).
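For context, standard AODE classifies by a uniform average over the SPODEs, while weighted variants such as the proposed IWAODE replace that average with per-model weights. A minimal sketch of the two prediction rules follows; the instance-dependent weight $w_i(\mathbf{x})$ is a placeholder for the paper's independence-based measure, whose exact definition is not given in this abstract.

% Standard AODE: a uniform average over d SPODEs, each rooted at
% one superparent attribute x_i (Webb et al.'s formulation).
\hat{P}(y \mid \mathbf{x}) \propto \frac{1}{d} \sum_{i=1}^{d} \hat{P}(y, x_i) \prod_{j=1, j \neq i}^{d} \hat{P}(x_j \mid y, x_i)

% Weighted variant: each SPODE receives its own weight. In IWAODE,
% w_i(\mathbf{x}) is assumed to reflect how well the i-th SPODE's
% independence assumption holds for the instance \mathbf{x} at hand.
\hat{P}(y \mid \mathbf{x}) \propto \sum_{i=1}^{d} w_i(\mathbf{x}) \, \hat{P}(y, x_i) \prod_{j=1, j \neq i}^{d} \hat{P}(x_j \mid y, x_i)

Setting $w_i(\mathbf{x}) = 1/d$ for all $i$ recovers plain AODE, while weights fixed per SPODE (independent of $\mathbf{x}$) correspond to metric-based schemes such as WAODE-MI.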
Keywords