# Finding Optimal Bayesian Networks with Local Structure

Proceedings of the Ninth International Conference on Probabilistic Graphical Models, PMLR 72:451-462, 2018.

#### Abstract

The idea of using decision trees as local models in Bayesian networks is revisited. A class of dyadic decision trees—proposed previously only for continuous conditioning variables—is augmented by incorporating categorical variables with arbitrary context-specific recursive splitting of their state spaces. It is shown that the resulting model class admits computationally feasible maximization of a Bayes score in a range of moderate-size problem instances. In particular, it enables global optimization of the Bayesian network structure, including the local structure, using state-of-the-art exact algorithms. The paper also introduces a related model class that extends ordinary conditional probability tables to continuous variables by employing an adaptive discretization approach. The two model classes are compared empirically by learning Bayesian networks from benchmark real-world and synthetic data sets. The relative strengths of the model classes are discussed.
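To make the idea concrete, the following is a minimal sketch (not taken from the paper) of context-specific independence captured by a decision tree used as a local model, contrasted with a full conditional probability table over the same parents. All names and numbers are illustrative assumptions.

```python
# Hedged sketch: a tiny decision-tree CPD exhibiting context-specific
# independence, versus a full conditional probability table (CPT).

# Binary child Y with binary parents A and B.
# Full CPT: one distribution P(Y | a, b) per parent configuration (4 rows).
full_cpt = {
    (0, 0): {0: 0.9, 1: 0.1},
    (0, 1): {0: 0.9, 1: 0.1},   # identical to (0, 0): B is irrelevant when A = 0
    (1, 0): {0: 0.3, 1: 0.7},
    (1, 1): {0: 0.6, 1: 0.4},
}

# Decision-tree CPD: split on A first; consult B only in the context A = 1.
# The A = 0 branch shares a single leaf, so 3 leaves replace the 4 CPT rows.
def tree_cpd(a, b):
    if a == 0:
        return {0: 0.9, 1: 0.1}   # leaf: P(Y | A = 0), B ignored
    if b == 0:
        return {0: 0.3, 1: 0.7}   # leaf: P(Y | A = 1, B = 0)
    return {0: 0.6, 1: 0.4}       # leaf: P(Y | A = 1, B = 1)

# The tree reproduces the CPT exactly while using fewer free parameters.
assert all(tree_cpd(a, b) == full_cpt[(a, b)] for a in (0, 1) for b in (0, 1))
```

The parameter saving grows with the number of parents: whenever a parent is irrelevant in some context, the tree collapses all corresponding CPT rows into one leaf, which is what makes a Bayes score over tree-structured local models worth optimizing.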