Cauchy Graphical Models

Taurai Muvunza, Yang Li, Ercan Engin Kuruoglu
Proceedings of The 12th International Conference on Probabilistic Graphical Models, PMLR 246:528-542, 2024.

Abstract

A common approach to learning Bayesian networks involves specifying an appropriately chosen family of parameterized probability densities, such as the Gaussian. However, the distribution of most real-life data is leptokurtic and may not necessarily be best described by a Gaussian process. In this work we introduce Cauchy Graphical Models (CGM), a class of multivariate Cauchy densities that can be represented as directed acyclic graphs with arbitrary network topologies, the edges of which encode linear dependencies between random variables. We develop CGLearn, the resultant algorithm for learning the structure and Cauchy parameters based on the Minimum Dispersion Criterion (MDC). Experiments using simulated datasets on benchmark network topologies demonstrate the efficacy of our approach when compared to Gaussian Graphical Models (GGM).
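The abstract's two key ingredients, linear dependencies along DAG edges with Cauchy noise and minimum-dispersion estimation, can be illustrated with a toy sketch. This is a hypothetical illustration, not the paper's CGLearn algorithm: it generates data from a single edge X1 → X2 with Cauchy innovations and recovers the edge weight by minimizing the median absolute residual, a simple robust proxy for Cauchy dispersion (the usual least-squares estimate is unreliable here because Cauchy variables have no finite variance).

```python
# Toy illustration (not the paper's CGLearn): recover the coefficient of
# a single DAG edge X2 = b * X1 + noise, where both X1 and the noise are
# Cauchy, by minimizing a dispersion-style criterion over a grid.
import math
import random

random.seed(0)

def cauchy(scale=1.0):
    # Inverse-CDF sampler: tan(pi * (U - 1/2)) is standard Cauchy.
    return scale * math.tan(math.pi * (random.random() - 0.5))

# Simulate the edge X1 -> X2 with true weight b_true = 0.8.
b_true = 0.8
x1 = [cauchy() for _ in range(5000)]
x2 = [b_true * x + cauchy(0.1) for x in x1]

def median_abs_residual(b):
    # Median |X2 - b * X1|: for Cauchy data this tracks the dispersion
    # of the residual, whereas the sample variance is not even finite.
    r = sorted(abs(y - b * x) for x, y in zip(x1, x2))
    return r[len(r) // 2]

# Grid search for the dispersion-minimizing coefficient.
grid = [i / 100 for i in range(201)]
b_hat = min(grid, key=median_abs_residual)
# b_hat should land at (or very near) the true coefficient 0.8.
```

The residual (b_true - b) * X1 + noise is again Cauchy with dispersion |b_true - b| + 0.1, so the criterion is minimized exactly at the true weight; this is the intuition behind minimum-dispersion fitting, though CGLearn's actual estimator and structure search are described in the paper itself.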

Cite this Paper


BibTeX
@InProceedings{pmlr-v246-muvunza24a,
  title     = {Cauchy Graphical Models},
  author    = {Muvunza, Taurai and Li, Yang and Kuruoglu, Ercan Engin},
  booktitle = {Proceedings of The 12th International Conference on Probabilistic Graphical Models},
  pages     = {528--542},
  year      = {2024},
  editor    = {Kwisthout, Johan and Renooij, Silja},
  volume    = {246},
  series    = {Proceedings of Machine Learning Research},
  month     = {11--13 Sep},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v246/main/assets/muvunza24a/muvunza24a.pdf},
  url       = {https://proceedings.mlr.press/v246/muvunza24a.html},
  abstract  = {A common approach to learning Bayesian networks involves specifying an appropriately chosen family of parameterized probability densities, such as the Gaussian. However, the distribution of most real-life data is leptokurtic and may not necessarily be best described by a Gaussian process. In this work we introduce Cauchy Graphical Models (CGM), a class of multivariate Cauchy densities that can be represented as directed acyclic graphs with arbitrary network topologies, the edges of which encode linear dependencies between random variables. We develop CGLearn, the resultant algorithm for learning the structure and Cauchy parameters based on the Minimum Dispersion Criterion (MDC). Experiments using simulated datasets on benchmark network topologies demonstrate the efficacy of our approach when compared to Gaussian Graphical Models (GGM).}
}
Endnote
%0 Conference Paper
%T Cauchy Graphical Models
%A Taurai Muvunza
%A Yang Li
%A Ercan Engin Kuruoglu
%B Proceedings of The 12th International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2024
%E Johan Kwisthout
%E Silja Renooij
%F pmlr-v246-muvunza24a
%I PMLR
%P 528--542
%U https://proceedings.mlr.press/v246/muvunza24a.html
%V 246
%X A common approach to learning Bayesian networks involves specifying an appropriately chosen family of parameterized probability densities, such as the Gaussian. However, the distribution of most real-life data is leptokurtic and may not necessarily be best described by a Gaussian process. In this work we introduce Cauchy Graphical Models (CGM), a class of multivariate Cauchy densities that can be represented as directed acyclic graphs with arbitrary network topologies, the edges of which encode linear dependencies between random variables. We develop CGLearn, the resultant algorithm for learning the structure and Cauchy parameters based on the Minimum Dispersion Criterion (MDC). Experiments using simulated datasets on benchmark network topologies demonstrate the efficacy of our approach when compared to Gaussian Graphical Models (GGM).
APA
Muvunza, T., Li, Y. & Kuruoglu, E.E. (2024). Cauchy Graphical Models. Proceedings of The 12th International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 246:528-542. Available from https://proceedings.mlr.press/v246/muvunza24a.html.

Related Material