Consistent Learning Bayesian Networks with Thousands of Variables

Kazuki Natori, Masaki Uto, Maomi Ueno
Proceedings of The 3rd International Workshop on Advanced Methodologies for Bayesian Networks, PMLR 73:57-68, 2017.

Abstract

We have previously proposed a constraint-based method for learning Bayesian networks that uses the Bayes factor for its conditional independence tests. Because the Bayes-factor conditional independence test is consistent, the method improves on the learning accuracy of traditional constraint-based methods. Moreover, because it greatly improves computational efficiency, the method is expected to learn larger network structures than traditional methods can. However, these expected benefits had not been demonstrated empirically. This report describes experiments on learning large network structures. The results show that the proposed method can learn surprisingly large networks with thousands of variables.
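To make the idea of a Bayes-factor conditional independence test concrete, the sketch below scores discrete data with a BDeu-style Dirichlet-multinomial marginal likelihood and compares the model in which X's parent set is Z against the one in which it is Z together with Y; a positive log Bayes factor favours independence. This is a minimal illustration under those assumptions, not the authors' implementation: the function names, the BDeu prior, and the zero threshold are ours, and the paper's exact formulation may differ.

# Sketch of a Bayes-factor conditional-independence test for discrete data,
# assuming a BDeu-style Dirichlet-multinomial marginal likelihood.
# Illustrative only; names and prior choices are not from the paper.
import itertools
import numpy as np
from scipy.special import gammaln


def log_marginal_likelihood(data, child, parents, cards, alpha=1.0):
    """BDeu-style log marginal likelihood of `child` given parent set `parents`.

    data:    integer array of shape (n_samples, n_variables)
    child:   column index of the child variable
    parents: list of column indices of the parent variables
    cards:   cardinality (number of states) of each variable
    alpha:   equivalent sample size of the Dirichlet prior
    """
    r = cards[child]
    q = int(np.prod([cards[p] for p in parents])) if parents else 1
    a_j, a_jk = alpha / q, alpha / (q * r)

    score = 0.0
    for config in itertools.product(*[range(cards[p]) for p in parents]):
        # Select the samples matching this parent configuration.
        mask = np.ones(len(data), dtype=bool)
        for p, v in zip(parents, config):
            mask &= data[:, p] == v
        counts = np.bincount(data[mask, child], minlength=r)
        score += gammaln(a_j) - gammaln(a_j + counts.sum())
        score += np.sum(gammaln(a_jk + counts) - gammaln(a_jk))
    return score


def log_bayes_factor_ci(data, x, y, z, cards, alpha=1.0):
    """log Bayes factor of 'X independent of Y given Z' versus 'X depends on
    Y given Z', scored by whether adding Y to X's parent set Z improves the
    marginal likelihood.  Positive values favour conditional independence."""
    indep = log_marginal_likelihood(data, x, list(z), cards, alpha)
    dep = log_marginal_likelihood(data, x, list(z) + [y], cards, alpha)
    return indep - dep


# Example: three binary variables where X and Y are independent given Z,
# so the log Bayes factor should typically come out positive.
rng = np.random.default_rng(0)
z_col = rng.integers(0, 2, size=1000)
x_col = (z_col + rng.integers(0, 2, size=1000)) % 2
y_col = (z_col + rng.integers(0, 2, size=1000)) % 2
data = np.column_stack([x_col, y_col, z_col])
print(log_bayes_factor_ci(data, x=0, y=1, z=[2], cards=[2, 2, 2]))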

Cite this Paper


BibTeX
@InProceedings{pmlr-v73-natori17a,
  title     = {Consistent Learning Bayesian Networks with Thousands of Variables},
  author    = {Kazuki Natori and Masaki Uto and Maomi Ueno},
  booktitle = {Proceedings of The 3rd International Workshop on Advanced Methodologies for Bayesian Networks},
  pages     = {57--68},
  year      = {2017},
  editor    = {Antti Hyttinen and Joe Suzuki and Brandon Malone},
  volume    = {73},
  series    = {Proceedings of Machine Learning Research},
  month     = {20--22 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v73/natori17a/natori17a.pdf},
  url       = {http://proceedings.mlr.press/v73/natori17a.html},
  abstract  = {We have already proposed a constraint-based learning Bayesian network method using Bayes factor. Since a conditional independence test using Bayes factor has consistency, the learning method improves the learning accuracy of the traditional constraint-based learning methods. Additionally, the method is expected to learn larger network structures than the traditional methods do because it greatly improves computational efficiency. However, its expected benefits have not been demonstrated empirically. This report describes some experiments related to the learning of large network structures. Results show that the proposed method can learn surprisingly huge networks with thousands of variables.}
}
Endnote
%0 Conference Paper
%T Consistent Learning Bayesian Networks with Thousands of Variables
%A Kazuki Natori
%A Masaki Uto
%A Maomi Ueno
%B Proceedings of The 3rd International Workshop on Advanced Methodologies for Bayesian Networks
%C Proceedings of Machine Learning Research
%D 2017
%E Antti Hyttinen
%E Joe Suzuki
%E Brandon Malone
%F pmlr-v73-natori17a
%I PMLR
%J Proceedings of Machine Learning Research
%P 57--68
%U http://proceedings.mlr.press
%V 73
%W PMLR
%X We have already proposed a constraint-based learning Bayesian network method using Bayes factor. Since a conditional independence test using Bayes factor has consistency, the learning method improves the learning accuracy of the traditional constraint-based learning methods. Additionally, the method is expected to learn larger network structures than the traditional methods do because it greatly improves computational efficiency. However, its expected benefits have not been demonstrated empirically. This report describes some experiments related to the learning of large network structures. Results show that the proposed method can learn surprisingly huge networks with thousands of variables.
APA
Natori, K., Uto, M. & Ueno, M. (2017). Consistent Learning Bayesian Networks with Thousands of Variables. Proceedings of The 3rd International Workshop on Advanced Methodologies for Bayesian Networks, in PMLR 73:57-68.

Related Material

Download PDF: http://proceedings.mlr.press/v73/natori17a/natori17a.pdf