Parallel Bayesian Network Structure Learning

Tian Gao, Dennis Wei
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:1685-1694, 2018.

Abstract

Recent advances in Bayesian Network (BN) structure learning have focused on local-to-global learning, where the graph structure is learned via one local subgraph at a time. As a natural progression, we investigate parallel learning of BN structures via multiple learning agents simultaneously, where each agent learns one local subgraph at a time. We find that parallel learning can reduce the number of subgraphs requiring structure learning by storing previously queried results and communicating (even partial) results among agents. More specifically, by using novel rules on query subset and superset inference, many subgraph structures can be inferred without learning. We provide a sound and complete parallel structure learning (PSL) algorithm, and demonstrate its improved efficiency over state-of-the-art single-thread learning algorithms.
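The caching-and-reuse idea behind the paper can be illustrated with a toy sketch. This is not the PSL algorithm itself (the subset/superset inference rules are not reproduced here); `learn_local_structure` is a hypothetical stand-in for any expensive local learner (e.g., Markov blanket discovery), and the sketch only shows how parallel agents sharing a query cache avoid re-learning subgraphs that another agent has already answered.

```python
from concurrent.futures import ThreadPoolExecutor
import threading

# Shared store of previously queried results: any agent can reuse a local
# structure that another agent has already learned.
cache = {}
lock = threading.Lock()

def learn_local_structure(target):
    # Placeholder for an expensive local structure-learning query on a
    # chain-structured toy graph (variable i is adjacent to i-1 and i+1).
    return {"target": target,
            "neighbors": sorted({target - 1, target + 1} - {-1})}

def query(target):
    # Check the shared cache before learning; store the result afterwards.
    with lock:
        if target in cache:
            return cache[target]
    result = learn_local_structure(target)
    with lock:
        cache.setdefault(target, result)
    return cache[target]

def agent(targets):
    # Each agent learns one local subgraph at a time for its assigned variables.
    return [query(t) for t in targets]

# Two agents with overlapping workloads; the overlap (variable 2) is served
# from the cache rather than learned twice.
with ThreadPoolExecutor(max_workers=2) as ex:
    futures = [ex.submit(agent, [0, 1, 2]), ex.submit(agent, [2, 3, 4])]
    results = [f.result() for f in futures]
```

In the paper, reuse goes further than exact-match caching: partial results are communicated among agents, and the subset/superset inference rules let many subgraph structures be deduced from prior queries without any new learning.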

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-gao18b,
  title     = {Parallel {B}ayesian Network Structure Learning},
  author    = {Gao, Tian and Wei, Dennis},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {1685--1694},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/gao18b/gao18b.pdf},
  url       = {https://proceedings.mlr.press/v80/gao18b.html},
  abstract  = {Recent advances in Bayesian Network (BN) structure learning have focused on local-to-global learning, where the graph structure is learned via one local subgraph at a time. As a natural progression, we investigate parallel learning of BN structures via multiple learning agents simultaneously, where each agent learns one local subgraph at a time. We find that parallel learning can reduce the number of subgraphs requiring structure learning by storing previously queried results and communicating (even partial) results among agents. More specifically, by using novel rules on query subset and superset inference, many subgraph structures can be inferred without learning. We provide a sound and complete parallel structure learning (PSL) algorithm, and demonstrate its improved efficiency over state-of-the-art single-thread learning algorithms.}
}
Endnote
%0 Conference Paper
%T Parallel Bayesian Network Structure Learning
%A Tian Gao
%A Dennis Wei
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-gao18b
%I PMLR
%P 1685--1694
%U https://proceedings.mlr.press/v80/gao18b.html
%V 80
%X Recent advances in Bayesian Network (BN) structure learning have focused on local-to-global learning, where the graph structure is learned via one local subgraph at a time. As a natural progression, we investigate parallel learning of BN structures via multiple learning agents simultaneously, where each agent learns one local subgraph at a time. We find that parallel learning can reduce the number of subgraphs requiring structure learning by storing previously queried results and communicating (even partial) results among agents. More specifically, by using novel rules on query subset and superset inference, many subgraph structures can be inferred without learning. We provide a sound and complete parallel structure learning (PSL) algorithm, and demonstrate its improved efficiency over state-of-the-art single-thread learning algorithms.
APA
Gao, T. & Wei, D. (2018). Parallel Bayesian Network Structure Learning. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:1685-1694. Available from https://proceedings.mlr.press/v80/gao18b.html.