Exact Inference on Conditional Linear Γ-Gaussian Bayesian Networks

Ivar Simonsson, Petter Mostad
Proceedings of the Eighth International Conference on Probabilistic Graphical Models, PMLR 52:474-486, 2016.

Abstract

Exact inference for Bayesian Networks is only possible for quite limited classes of networks. Examples of such classes are discrete networks, conditional linear Gaussian networks, networks using mixtures of truncated exponentials, and networks with densities expressed as truncated polynomials. This paper defines another class with exact inference, based on the normal inverse gamma conjugacy. We describe the theory of this class as well as exemplify our implemented inference algorithm in a practical example. Although generally small and simple, we believe these kinds of networks are potentially quite useful, on their own or in combination with other algorithms and methods for Bayesian Network inference.
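The class described in the abstract rests on the normal-inverse-gamma conjugacy. As a minimal illustration of that building block (the textbook conjugate update, not the paper's inference algorithm), the following Python sketch updates the hyperparameters of a normal-inverse-gamma prior after observing data from a normal likelihood with unknown mean and variance; the function name and parameterization are this sketch's own choices.

```python
def nig_update(m, lam, a, b, data):
    """Conjugate posterior update under a normal-inverse-gamma prior.

    Prior: sigma^2 ~ InvGamma(a, b), mu | sigma^2 ~ N(m, sigma^2 / lam).
    Likelihood: x_i ~ N(mu, sigma^2), i.i.d.
    Returns the posterior hyperparameters (m', lam', a', b').
    """
    n = len(data)
    xbar = sum(data) / n
    ss = sum((x - xbar) ** 2 for x in data)  # within-sample sum of squares

    lam_post = lam + n
    m_post = (lam * m + n * xbar) / lam_post          # precision-weighted mean
    a_post = a + n / 2.0
    b_post = b + 0.5 * (ss + lam * n * (xbar - m) ** 2 / lam_post)
    return m_post, lam_post, a_post, b_post
```

Because the posterior stays in the same family as the prior, such updates can be applied in closed form, which is the kind of property that makes exact inference tractable for the network class the paper defines.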

Cite this Paper


BibTeX
@InProceedings{pmlr-v52-simonsson16,
  title     = {Exact Inference on Conditional Linear {$\Gamma$}-{G}aussian {B}ayesian Networks},
  author    = {Simonsson, Ivar and Mostad, Petter},
  booktitle = {Proceedings of the Eighth International Conference on Probabilistic Graphical Models},
  pages     = {474--486},
  year      = {2016},
  editor    = {Antonucci, Alessandro and Corani, Giorgio and Campos, Cassio Polpo},
  volume    = {52},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lugano, Switzerland},
  month     = {06--09 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v52/simonsson16.pdf},
  url       = {https://proceedings.mlr.press/v52/simonsson16.html},
  abstract  = {Exact inference for Bayesian Networks is only possible for quite limited classes of networks. Examples of such classes are discrete networks, conditional linear Gaussian networks, networks using mixtures of truncated exponentials, and networks with densities expressed as truncated polynomials. This paper defines another class with exact inference, based on the normal inverse gamma conjugacy. We describe the theory of this class as well as exemplify our implemented inference algorithm in a practical example. Although generally small and simple, we believe these kinds of networks are potentially quite useful, on their own or in combination with other algorithms and methods for Bayesian Network inference.}
}
Endnote
%0 Conference Paper
%T Exact Inference on Conditional Linear Γ-Gaussian Bayesian Networks
%A Ivar Simonsson
%A Petter Mostad
%B Proceedings of the Eighth International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2016
%E Alessandro Antonucci
%E Giorgio Corani
%E Cassio Polpo Campos
%F pmlr-v52-simonsson16
%I PMLR
%P 474--486
%U https://proceedings.mlr.press/v52/simonsson16.html
%V 52
%X Exact inference for Bayesian Networks is only possible for quite limited classes of networks. Examples of such classes are discrete networks, conditional linear Gaussian networks, networks using mixtures of truncated exponentials, and networks with densities expressed as truncated polynomials. This paper defines another class with exact inference, based on the normal inverse gamma conjugacy. We describe the theory of this class as well as exemplify our implemented inference algorithm in a practical example. Although generally small and simple, we believe these kinds of networks are potentially quite useful, on their own or in combination with other algorithms and methods for Bayesian Network inference.
RIS
TY  - CPAPER
TI  - Exact Inference on Conditional Linear Γ-Gaussian Bayesian Networks
AU  - Ivar Simonsson
AU  - Petter Mostad
BT  - Proceedings of the Eighth International Conference on Probabilistic Graphical Models
DA  - 2016/08/15
ED  - Alessandro Antonucci
ED  - Giorgio Corani
ED  - Cassio Polpo Campos
ID  - pmlr-v52-simonsson16
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 52
SP  - 474
EP  - 486
L1  - http://proceedings.mlr.press/v52/simonsson16.pdf
UR  - https://proceedings.mlr.press/v52/simonsson16.html
AB  - Exact inference for Bayesian Networks is only possible for quite limited classes of networks. Examples of such classes are discrete networks, conditional linear Gaussian networks, networks using mixtures of truncated exponentials, and networks with densities expressed as truncated polynomials. This paper defines another class with exact inference, based on the normal inverse gamma conjugacy. We describe the theory of this class as well as exemplify our implemented inference algorithm in a practical example. Although generally small and simple, we believe these kinds of networks are potentially quite useful, on their own or in combination with other algorithms and methods for Bayesian Network inference.
ER  -
APA
Simonsson, I. & Mostad, P. (2016). Exact Inference on Conditional Linear Γ-Gaussian Bayesian Networks. Proceedings of the Eighth International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 52:474-486. Available from https://proceedings.mlr.press/v52/simonsson16.html.