Learning Non-parametric Markov Networks with Mutual Information

Janne Leppä-Aho, Santeri Räisänen, Xiao Yang, Teemu Roos
Proceedings of the Ninth International Conference on Probabilistic Graphical Models, PMLR 72:213-224, 2018.

Abstract

We propose a method for learning Markov network structures for continuous data without assuming any particular parametric distribution for the variables. The method makes use of previous work on a non-parametric estimator for mutual information which is used to create a non-parametric test for multivariate conditional independence. This independence test is then combined with an efficient constraint-based algorithm for learning the graph structure. The performance of the method is evaluated on several synthetic data sets and it is shown to learn more accurate structures than competing methods when the dependencies between the variables involve non-linearities.
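Illustrative Code Sketch

The abstract describes a pipeline with three ingredients: a non-parametric mutual information estimator, a conditional independence test built on top of it, and a constraint-based graph search. As a rough, hedged illustration of how such a pipeline can fit together, the Python sketch below combines a standard kNN (Kraskov-Stögbauer-Grassberger) mutual information estimator, conditional mutual information via the chain rule, a simple permutation test, and a PC-style skeleton search. All names (ksg_mi, ci_test, learn_skeleton), the global permutation scheme, and the PC-style loop are assumptions made for illustration only; they are not taken from the paper, whose estimator and constraint-based algorithm may differ in detail.

import itertools

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma


def ksg_mi(x, y, k=5):
    """KSG (Kraskov et al., algorithm 1) estimate of I(X; Y) from (n, d) samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    xy = np.hstack([x, y])
    # Distance to the k-th nearest neighbour in the joint space (max-norm).
    eps = cKDTree(xy).query(xy, k=k + 1, p=np.inf)[0][:, -1]
    tree_x, tree_y = cKDTree(x), cKDTree(y)
    # Count points strictly within eps in each marginal space (excluding the point itself).
    nx = np.array([len(tree_x.query_ball_point(p, e - 1e-12, p=np.inf)) - 1
                   for p, e in zip(x, eps)])
    ny = np.array([len(tree_y.query_ball_point(p, e - 1e-12, p=np.inf)) - 1
                   for p, e in zip(y, eps)])
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))


def ksg_cmi(x, y, z, k=5):
    """I(X; Y | Z) via the chain-rule difference I(X; (Y, Z)) - I(X; Z)."""
    return ksg_mi(x, np.hstack([y, z]), k) - ksg_mi(x, z, k)


def ci_test(x, y, z=None, k=5, n_perm=100, alpha=0.05, seed=0):
    """Permutation test of X independent of Y given Z; True if independence is NOT rejected.

    NOTE: the global permutation used here is a crude surrogate; a Z-local
    permutation scheme would respect the conditioning set more faithfully.
    """
    rng = np.random.default_rng(seed)
    stat = (lambda xx: ksg_mi(xx, y, k)) if z is None else (lambda xx: ksg_cmi(xx, y, z, k))
    observed = stat(x)
    null = [stat(x[rng.permutation(len(x))]) for _ in range(n_perm)]
    p_value = (1 + sum(s >= observed for s in null)) / (n_perm + 1)
    return p_value > alpha


def learn_skeleton(data, max_cond=2, **test_kwargs):
    """PC-style skeleton search: drop edge (i, j) as soon as some conditioning set S
    drawn from the current neighbourhoods makes X_i and X_j test as independent."""
    n_vars = data.shape[1]
    adj = {i: set(range(n_vars)) - {i} for i in range(n_vars)}
    for size in range(max_cond + 1):
        for i in range(n_vars):
            for j in sorted(adj[i]):
                if j < i:
                    continue
                candidates = sorted((adj[i] | adj[j]) - {i, j})
                for S in itertools.combinations(candidates, size):
                    z = data[:, list(S)] if S else None
                    if ci_test(data[:, [i]], data[:, [j]], z, **test_kwargs):
                        adj[i].discard(j)
                        adj[j].discard(i)
                        break
    return adj


# Toy usage (unoptimised, can be slow): a chain X0 - X1 - X2 with non-linear links.
rng = np.random.default_rng(1)
x0 = rng.normal(size=(300, 1))
x1 = np.sin(2 * x0) + 0.3 * rng.normal(size=(300, 1))
x2 = x1 ** 2 + 0.3 * rng.normal(size=(300, 1))
print(learn_skeleton(np.hstack([x0, x1, x2]), max_cond=1))

Since the dependencies above are non-linear, a test based on partial correlations would tend to miss them, which is the kind of setting the abstract says motivates the mutual-information-based approach.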

Cite this Paper


BibTeX
@InProceedings{pmlr-v72-leppa-aho18a,
  title     = {Learning Non-parametric Markov Networks with Mutual Information},
  author    = {Lepp\"{a}-Aho, Janne and R\"{a}is\"{a}nen, Santeri and Yang, Xiao and Roos, Teemu},
  booktitle = {Proceedings of the Ninth International Conference on Probabilistic Graphical Models},
  pages     = {213--224},
  year      = {2018},
  editor    = {Kratochvíl, Václav and Studený, Milan},
  volume    = {72},
  series    = {Proceedings of Machine Learning Research},
  month     = {11--14 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v72/leppa-aho18a/leppa-aho18a.pdf},
  url       = {https://proceedings.mlr.press/v72/leppa-aho18a.html},
  abstract  = {We propose a method for learning Markov network structures for continuous data without assuming any particular parametric distribution for the variables. The method makes use of previous work on a non-parametric estimator for mutual information which is used to create a non-parametric test for multivariate conditional independence. This independence test is then combined with an efficient constraint-based algorithm for learning the graph structure. The performance of the method is evaluated on several synthetic data sets and it is shown to learn more accurate structures than competing methods when the dependencies between the variables involve non-linearities.}
}
Endnote
%0 Conference Paper
%T Learning Non-parametric Markov Networks with Mutual Information
%A Janne Leppä-Aho
%A Santeri Räisänen
%A Xiao Yang
%A Teemu Roos
%B Proceedings of the Ninth International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2018
%E Václav Kratochvíl
%E Milan Studený
%F pmlr-v72-leppa-aho18a
%I PMLR
%P 213--224
%U https://proceedings.mlr.press/v72/leppa-aho18a.html
%V 72
%X We propose a method for learning Markov network structures for continuous data without assuming any particular parametric distribution for the variables. The method makes use of previous work on a non-parametric estimator for mutual information which is used to create a non-parametric test for multivariate conditional independence. This independence test is then combined with an efficient constraint-based algorithm for learning the graph structure. The performance of the method is evaluated on several synthetic data sets and it is shown to learn more accurate structures than competing methods when the dependencies between the variables involve non-linearities.
APA
Leppä-Aho, J., Räisänen, S., Yang, X., & Roos, T. (2018). Learning Non-parametric Markov Networks with Mutual Information. Proceedings of the Ninth International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 72:213-224. Available from https://proceedings.mlr.press/v72/leppa-aho18a.html.