Learning Bayesian Network Structure using LP Relaxations

Tommi Jaakkola, David Sontag, Amir Globerson, Marina Meila
Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, PMLR 9:358-365, 2010.

Abstract

We propose to solve the combinatorial problem of finding the highest scoring Bayesian network structure from data. This structure learning problem can be viewed as an inference problem where the variables specify the choice of parents for each node in the graph. The key combinatorial difficulty arises from the global constraint that the graph structure has to be acyclic. We cast the structure learning problem as a linear program over the polytope defined by valid acyclic structures. In relaxing this problem, we maintain an outer bound approximation to the polytope and iteratively tighten it by searching over a new class of valid constraints. If an integral solution is found, it is guaranteed to be the optimal Bayesian network. When the relaxation is not tight, the fast dual algorithms we develop remain useful in combination with a branch and bound method. Empirical results suggest that the method is competitive or faster than alternative exact methods based on dynamic programming.
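The abstract describes maximizing a decomposable score over each node's choice of parent set, subject to the global constraint that the resulting directed graph is acyclic. A minimal brute-force sketch of that underlying combinatorial problem may help make it concrete; this is the exact problem the paper's LP relaxation approximates, not the paper's algorithm, and all names and toy scores here are illustrative:

```python
from itertools import combinations, product

def parent_sets(nodes, v):
    """All subsets of nodes \\ {v} as candidate parent sets for node v."""
    others = [u for u in nodes if u != v]
    return [frozenset(c) for r in range(len(others) + 1)
            for c in combinations(others, r)]

def is_acyclic(assignment):
    """Check the graph induced by parent-set choices (node -> parent set)
    by repeatedly peeling off nodes whose parents are all already removed."""
    remaining = dict(assignment)
    while remaining:
        sources = [v for v, ps in remaining.items()
                   if not (ps & set(remaining))]
        if not sources:          # every remaining node has a remaining parent
            return False         # -> a directed cycle exists
        for v in sources:
            del remaining[v]
    return True

def best_dag(nodes, score):
    """Exhaustively maximize the sum of local scores over acyclic
    assignments of a parent set to each node (exponential; tiny n only)."""
    best, best_val = None, float('-inf')
    choices = [parent_sets(nodes, v) for v in nodes]
    for combo in product(*choices):
        assignment = dict(zip(nodes, combo))
        if not is_acyclic(assignment):
            continue
        val = sum(score[v][ps] for v, ps in assignment.items())
        if val > best_val:
            best, best_val = assignment, val
    return best, best_val
```

For example, if both the edge a→b and the edge b→a are individually rewarded, the acyclicity constraint forces the optimum to keep only one of them; the paper's contribution is to enforce this kind of global constraint through an LP over the polytope of acyclic structures rather than by enumeration.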

Cite this Paper


BibTeX
@InProceedings{pmlr-v9-jaakkola10a,
  title     = {Learning Bayesian Network Structure using LP Relaxations},
  author    = {Jaakkola, Tommi and Sontag, David and Globerson, Amir and Meila, Marina},
  booktitle = {Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {358--365},
  year      = {2010},
  editor    = {Teh, Yee Whye and Titterington, Mike},
  volume    = {9},
  series    = {Proceedings of Machine Learning Research},
  address   = {Chia Laguna Resort, Sardinia, Italy},
  month     = {13--15 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v9/jaakkola10a/jaakkola10a.pdf},
  url       = {https://proceedings.mlr.press/v9/jaakkola10a.html},
  abstract  = {We propose to solve the combinatorial problem of finding the highest scoring Bayesian network structure from data. This structure learning problem can be viewed as an inference problem where the variables specify the choice of parents for each node in the graph. The key combinatorial difficulty arises from the global constraint that the graph structure has to be acyclic. We cast the structure learning problem as a linear program over the polytope defined by valid acyclic structures. In relaxing this problem, we maintain an outer bound approximation to the polytope and iteratively tighten it by searching over a new class of valid constraints. If an integral solution is found, it is guaranteed to be the optimal Bayesian network. When the relaxation is not tight, the fast dual algorithms we develop remain useful in combination with a branch and bound method. Empirical results suggest that the method is competitive or faster than alternative exact methods based on dynamic programming.}
}
Endnote
%0 Conference Paper
%T Learning Bayesian Network Structure using LP Relaxations
%A Tommi Jaakkola
%A David Sontag
%A Amir Globerson
%A Marina Meila
%B Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2010
%E Yee Whye Teh
%E Mike Titterington
%F pmlr-v9-jaakkola10a
%I PMLR
%P 358--365
%U https://proceedings.mlr.press/v9/jaakkola10a.html
%V 9
%X We propose to solve the combinatorial problem of finding the highest scoring Bayesian network structure from data. This structure learning problem can be viewed as an inference problem where the variables specify the choice of parents for each node in the graph. The key combinatorial difficulty arises from the global constraint that the graph structure has to be acyclic. We cast the structure learning problem as a linear program over the polytope defined by valid acyclic structures. In relaxing this problem, we maintain an outer bound approximation to the polytope and iteratively tighten it by searching over a new class of valid constraints. If an integral solution is found, it is guaranteed to be the optimal Bayesian network. When the relaxation is not tight, the fast dual algorithms we develop remain useful in combination with a branch and bound method. Empirical results suggest that the method is competitive or faster than alternative exact methods based on dynamic programming.
RIS
TY  - CPAPER
TI  - Learning Bayesian Network Structure using LP Relaxations
AU  - Tommi Jaakkola
AU  - David Sontag
AU  - Amir Globerson
AU  - Marina Meila
BT  - Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
DA  - 2010/03/31
ED  - Yee Whye Teh
ED  - Mike Titterington
ID  - pmlr-v9-jaakkola10a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 9
SP  - 358
EP  - 365
L1  - http://proceedings.mlr.press/v9/jaakkola10a/jaakkola10a.pdf
UR  - https://proceedings.mlr.press/v9/jaakkola10a.html
AB  - We propose to solve the combinatorial problem of finding the highest scoring Bayesian network structure from data. This structure learning problem can be viewed as an inference problem where the variables specify the choice of parents for each node in the graph. The key combinatorial difficulty arises from the global constraint that the graph structure has to be acyclic. We cast the structure learning problem as a linear program over the polytope defined by valid acyclic structures. In relaxing this problem, we maintain an outer bound approximation to the polytope and iteratively tighten it by searching over a new class of valid constraints. If an integral solution is found, it is guaranteed to be the optimal Bayesian network. When the relaxation is not tight, the fast dual algorithms we develop remain useful in combination with a branch and bound method. Empirical results suggest that the method is competitive or faster than alternative exact methods based on dynamic programming.
ER  -
APA
Jaakkola, T., Sontag, D., Globerson, A. & Meila, M. (2010). Learning Bayesian Network Structure using LP Relaxations. Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 9:358-365. Available from https://proceedings.mlr.press/v9/jaakkola10a.html.