Finding Optimal Bayesian Networks with Local Structure
Proceedings of the Ninth International Conference on Probabilistic Graphical Models, PMLR 72:451-462, 2018.
Abstract
The idea of using decision trees as local models in Bayesian networks is revisited. A class of dyadic decision trees, proposed previously only for continuous conditioning variables, is augmented by incorporating categorical variables with arbitrary context-specific recursive splitting of their state spaces. It is shown that the resulting model class admits computationally feasible maximization of a Bayes score in a range of moderate-size problem instances. In particular, it enables global optimization of the Bayesian network structure, including the local structure, using state-of-the-art exact algorithms. The paper also introduces a related model class that extends ordinary conditional probability tables to continuous variables by employing an adaptive discretization approach. The two model classes are compared empirically by learning Bayesian networks from benchmark real-world and synthetic data sets. The relative strengths of the model classes are discussed.