Bayesian out-trees

Tony Jebara
Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence, PMLR R6:315-324, 2008.

Abstract

A Bayesian treatment of latent directed graph structure for non-iid data is provided where each child datum is sampled with a directed conditional dependence on a single unknown parent datum. The latent graph structure is assumed to lie in the family of directed out-tree graphs, which leads to efficient Bayesian inference. The latent likelihood of the data and its gradients are computable in closed form via Tutte’s directed matrix tree theorem using determinants and inverses of the out-Laplacian. This novel likelihood subsumes the iid likelihood, is exchangeable, and yields efficient unsupervised and semi-supervised learning algorithms. In addition to handling taxonomy and phylogenetic datasets, the out-tree assumption performs surprisingly well as a semi-parametric density estimator on standard iid datasets. Experiments with unsupervised and semi-supervised learning are shown on various UCI and taxonomy datasets.
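The closed-form likelihood hinges on Tutte’s directed matrix tree theorem: a sum over all spanning out-trees of a product of edge weights collapses to the determinant of an out-Laplacian minor. The sketch below is a minimal illustration of that identity in NumPy, not code from the paper; the weight matrix W[i, j] and the function names are hypothetical, with W[i, j] standing in for a conditional dependence of datum j on parent datum i.

# Minimal sketch of Tutte's directed matrix tree theorem, assuming a
# nonnegative weight W[i, j] on each edge i -> j (standing in for a
# conditional likelihood p(x_j | x_i)) and no self-loops.
import itertools
import numpy as np

def sum_out_trees_det(W, root):
    """Weighted sum over all spanning out-trees rooted at `root`,
    computed as the determinant of an out-Laplacian minor."""
    L = np.diag(W.sum(axis=0)) - W          # out-Laplacian: D_in - W
    keep = [i for i in range(len(W)) if i != root]
    return np.linalg.det(L[np.ix_(keep, keep)])

def sum_out_trees_brute(W, root):
    """Brute-force check: enumerate parent assignments that form an out-tree."""
    n, total = len(W), 0.0
    others = [i for i in range(n) if i != root]
    for parents in itertools.product(range(n), repeat=len(others)):
        par = dict(zip(others, parents))
        ok = all(p != c for c, p in par.items())   # no self-parents
        for c in others:                           # every node must reach the root
            seen, node = set(), c
            while ok and node != root:
                if node in seen:                   # cycle -> not a tree
                    ok = False
                seen.add(node)
                node = par[node]
        if ok:
            total += np.prod([W[p, c] for c, p in par.items()])
    return total

rng = np.random.default_rng(0)
W = rng.random((4, 4))
np.fill_diagonal(W, 0.0)                    # no self-loops
print(sum_out_trees_det(W, root=0))         # determinant of the Laplacian minor
print(sum_out_trees_brute(W, root=0))       # agrees for small n

The determinant replaces an exponential sum over tree structures with a polynomial-time computation, and differentiating the log-determinant brings in the inverse of the out-Laplacian minor, which is consistent with the abstract’s mention of determinants and inverses for the likelihood and its gradients.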

Cite this Paper


BibTeX
@InProceedings{pmlr-vR6-jebara08a,
  title     = {Bayesian out-trees},
  author    = {Jebara, Tony},
  booktitle = {Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence},
  pages     = {315--324},
  year      = {2008},
  editor    = {McAllester, David A. and Myllymäki, Petri},
  volume    = {R6},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--12 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/r6/main/assets/jebara08a/jebara08a.pdf},
  url       = {https://proceedings.mlr.press/r6/jebara08a.html},
  abstract  = {A Bayesian treatment of latent directed graph structure for non-iid data is provided where each child datum is sampled with a directed conditional dependence on a single unknown parent datum. The latent graph structure is assumed to lie in the family of directed out-tree graphs which leads to efficient Bayesian inference. The latent likelihood of the data and its gradients are computable in closed form via Tutte’s directed matrix tree theorem using determinants and inverses of the out-Laplacian. This novel likelihood subsumes iid likelihood, is exchangeable and yields efficient unsupervised and semi-supervised learning algorithms. In addition to handling taxonomy and phylogenetic datasets the out-tree assumption performs surprisingly well as a semi-parametric density estimator on standard iid datasets. Experiments with unsupervised and semi-supervised learning are shown on various UCI and taxonomy datasets.},
  note      = {Reissued by PMLR on 09 October 2024.}
}
Endnote
%0 Conference Paper
%T Bayesian out-trees
%A Tony Jebara
%B Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2008
%E David A. McAllester
%E Petri Myllymäki
%F pmlr-vR6-jebara08a
%I PMLR
%P 315--324
%U https://proceedings.mlr.press/r6/jebara08a.html
%V R6
%X A Bayesian treatment of latent directed graph structure for non-iid data is provided where each child datum is sampled with a directed conditional dependence on a single unknown parent datum. The latent graph structure is assumed to lie in the family of directed out-tree graphs which leads to efficient Bayesian inference. The latent likelihood of the data and its gradients are computable in closed form via Tutte’s directed matrix tree theorem using determinants and inverses of the out-Laplacian. This novel likelihood subsumes iid likelihood, is exchangeable and yields efficient unsupervised and semi-supervised learning algorithms. In addition to handling taxonomy and phylogenetic datasets the out-tree assumption performs surprisingly well as a semi-parametric density estimator on standard iid datasets. Experiments with unsupervised and semi-supervised learning are shown on various UCI and taxonomy datasets.
%Z Reissued by PMLR on 09 October 2024.
APA
Jebara, T. (2008). Bayesian out-trees. Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research R6:315-324. Available from https://proceedings.mlr.press/r6/jebara08a.html. Reissued by PMLR on 09 October 2024.