Consistent Learning Bayesian Networks with Thousands of Variables

Kazuki Natori, Masaki Uto, Maomi Ueno
Proceedings of The 3rd International Workshop on Advanced Methodologies for Bayesian Networks, PMLR 73:57-68, 2017.

Abstract

We have previously proposed a constraint-based method for learning Bayesian networks that uses the Bayes factor as its conditional independence test. Because this test is consistent, the method achieves higher learning accuracy than traditional constraint-based methods. In addition, because it greatly improves computational efficiency, the method is expected to learn larger network structures than traditional methods can. However, these expected benefits had not been demonstrated empirically. This report describes experiments on learning large network structures. The results show that the proposed method can learn surprisingly large networks with thousands of variables.
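For readers unfamiliar with the underlying idea, the sketch below illustrates how a Bayes-factor conditional independence test for discrete data can be built from Dirichlet-multinomial (BDeu-style) marginal likelihoods: the dependent model (Y with parents Z ∪ {X}) is compared against the independent model (Y with parents Z), and independence is accepted when the log Bayes factor is non-positive. This is a minimal sketch under assumed names and data layout, not the authors' implementation; the function names, the `alpha` hyperparameter value, and the dictionary-per-row data format are illustrative assumptions.

```python
# Hypothetical sketch of a Bayes-factor conditional independence test for
# discrete data, using a Dirichlet-multinomial (BDeu-style) marginal likelihood.
# Names and data layout are illustrative, not taken from the paper.
from collections import Counter
from itertools import product
from math import lgamma


def log_marginal_likelihood(data, child, parents, arities, alpha=1.0):
    """Log marginal likelihood of `child` given `parents` under a BDeu-like prior."""
    parent_states = list(product(*(range(arities[p]) for p in parents)))
    q = len(parent_states)          # number of parent configurations
    r = arities[child]              # number of child states
    a_ij = alpha / q                # prior mass per parent configuration
    a_ijk = alpha / (q * r)         # prior mass per (configuration, child state)

    # Count occurrences of each (parent configuration, child state) pair.
    counts = Counter()
    for row in data:                # row: dict mapping variable name -> state index
        key = tuple(row[p] for p in parents)
        counts[(key, row[child])] += 1

    ll = 0.0
    for key in parent_states:
        n_ij = sum(counts[(key, k)] for k in range(r))
        ll += lgamma(a_ij) - lgamma(a_ij + n_ij)
        for k in range(r):
            ll += lgamma(a_ijk + counts[(key, k)]) - lgamma(a_ijk)
    return ll


def bf_independence_test(data, x, y, z, arities, alpha=1.0):
    """Judge X independent of Y given Z when the log Bayes factor is <= 0."""
    dep = log_marginal_likelihood(data, y, z + [x], arities, alpha)   # model: Y depends on X given Z
    indep = log_marginal_likelihood(data, y, z, arities, alpha)       # model: Y independent of X given Z
    return (dep - indep) <= 0.0
```

In a constraint-based learner, such a test would replace the classical chi-square or G-test in the edge-deletion phase; because it compares full marginal likelihoods rather than a p-value against an arbitrary significance level, it is consistent as the sample size grows, which is the property the paper exploits.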

Cite this Paper


BibTeX
@InProceedings{pmlr-v73-natori17a,
  title     = {Consistent Learning Bayesian Networks with Thousands of Variables},
  author    = {Natori, Kazuki and Uto, Masaki and Ueno, Maomi},
  booktitle = {Proceedings of The 3rd International Workshop on Advanced Methodologies for Bayesian Networks},
  pages     = {57--68},
  year      = {2017},
  editor    = {Hyttinen, Antti and Suzuki, Joe and Malone, Brandon},
  volume    = {73},
  series    = {Proceedings of Machine Learning Research},
  month     = {20--22 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v73/natori17a/natori17a.pdf},
  url       = {https://proceedings.mlr.press/v73/natori17a.html},
  abstract  = {We have already proposed a constraint-based learning Bayesian network method using Bayes factor. Since a conditional independence test using Bayes factor has consistency, the learning method improves the learning accuracy of the traditional constraint-based learning methods. Additionally, the method is expected to learn larger network structures than the traditional methods do because it greatly improves computational efficiency. However, its expected benefits have not been demonstrated empirically. This report describes some experiments related to the learning of large network structures. Results show that the proposed method can learn surprisingly huge networks with thousands of variables.}
}
Endnote
%0 Conference Paper
%T Consistent Learning Bayesian Networks with Thousands of Variables
%A Kazuki Natori
%A Masaki Uto
%A Maomi Ueno
%B Proceedings of The 3rd International Workshop on Advanced Methodologies for Bayesian Networks
%C Proceedings of Machine Learning Research
%D 2017
%E Antti Hyttinen
%E Joe Suzuki
%E Brandon Malone
%F pmlr-v73-natori17a
%I PMLR
%P 57--68
%U https://proceedings.mlr.press/v73/natori17a.html
%V 73
%X We have already proposed a constraint-based learning Bayesian network method using Bayes factor. Since a conditional independence test using Bayes factor has consistency, the learning method improves the learning accuracy of the traditional constraint-based learning methods. Additionally, the method is expected to learn larger network structures than the traditional methods do because it greatly improves computational efficiency. However, its expected benefits have not been demonstrated empirically. This report describes some experiments related to the learning of large network structures. Results show that the proposed method can learn surprisingly huge networks with thousands of variables.
APA
Natori, K., Uto, M. & Ueno, M. (2017). Consistent Learning Bayesian Networks with Thousands of Variables. Proceedings of The 3rd International Workshop on Advanced Methodologies for Bayesian Networks, in Proceedings of Machine Learning Research 73:57-68. Available from https://proceedings.mlr.press/v73/natori17a.html.